A Study of Novice Systems Analysis Problem Solving Behaviors Using Protocol Analysis
1992-09-01
conducted. Each subject was given the same task to perform. The task involved a case study (Appendix B) of a utility company’s customer order processing system...behavior (Ramesh, 1989). The task was to design a customer order processing system that utilized a centralized telephone answering service center...of the utility company’s customer order processing system that was developed based on information obtained by a large systems consulting firm during
Understanding Processes and Timelines for Distributed Photovoltaic
data from more than 30,000 PV systems across 87 utilities in 16 states to better understand solar photovoltaic (PV) interconnection process time frames in the United States. This study includes an analysis of "Analysis Metrics" that shows the four steps involved in the utility interconnection process for solar
FTOOLS: A FITS Data Processing and Analysis Software Package
NASA Astrophysics Data System (ADS)
Blackburn, J. K.
FTOOLS, a highly modular collection of over 110 utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Science Archive Research Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. The collection of utilities provides both generic processing and analysis utilities and utilities specific to high energy astrophysics data sets used for the ASCA, ROSAT, GRO, and XTE missions. A core set of FTOOLS providing support for generic FITS data processing, FITS image analysis and timing analysis can easily be split out of the full software package for users not needing the high energy astrophysics mission utilities. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self documenting through the IRAF help facility and a stand alone help task. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
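The single-task, chainable design described above can be sketched in plain Python. This is an illustration of the design idea only; the function names and the table representation are invented and do not correspond to actual FTOOLS utilities:

```python
# Each "utility" does one simple task on a table held as a list of row dicts,
# and utilities are chained to build more complex operations, mirroring the
# FTOOLS design philosophy (illustrative sketch, not the real FTOOLS API).

def select_rows(table, predicate):
    """Keep only rows satisfying a boolean expression."""
    return [row for row in table if predicate(row)]

def extract_columns(table, names):
    """Extract specific columns from every row."""
    return [{k: row[k] for k in names} for row in table]

def bin_column(table, name, width):
    """Bin values in a column into histogram counts keyed by bin index."""
    counts = {}
    for row in table:
        b = int(row[name] // width)
        counts[b] = counts.get(b, 0) + 1
    return counts

events = [
    {"time": 0.5, "energy": 1.2},
    {"time": 1.7, "energy": 3.9},
    {"time": 2.2, "energy": 2.8},
]

# Chain the single-purpose utilities, as FTOOLS does in scripts:
bright = select_rows(events, lambda r: r["energy"] > 2.0)
times = extract_columns(bright, ["time"])
hist = bin_column(bright, "time", 1.0)
print(times)  # [{'time': 1.7}, {'time': 2.2}]
print(hist)   # {1: 1, 2: 1}
```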
Telecommunication market research processing
NASA Astrophysics Data System (ADS)
Dupont, J. F.
1983-06-01
The data processing in two telecommunication market investigations is described. One of the studies concerns the office applications of communication and the other the experiences with a videotex terminal. Statistical factorial analysis was performed on a large mass of data. A comparison between utilization intentions and effective utilization is made. Extensive rewriting of statistical analysis computer programs was required.
Uncertainty Budget Analysis for Dimensional Inspection Processes (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdez, Lucas M.
2012-07-26
This paper is intended to provide guidance and describe how to prepare an uncertainty analysis of a dimensional inspection process through the utilization of an uncertainty budget analysis. The uncertainty analysis follows the same methodology as the ISO GUM standard for calibration and testing. There is a specific distinction between how Type A and Type B uncertainty analyses are used in a general and a specific process. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
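The Type A / Type B distinction that the paper builds on can be illustrated with a minimal GUM-style budget. All readings and bounds below are hypothetical, not values from the paper:

```python
import math

# Type A: statistical evaluation from repeated readings.
# Type B: assumed rectangular (uniform) bound, e.g. instrument resolution.
# The two are combined in quadrature and expanded with coverage factor k = 2.

readings_mm = [10.002, 10.004, 10.001, 10.003, 10.002]  # hypothetical repeats
n = len(readings_mm)
mean = sum(readings_mm) / n
s = math.sqrt(sum((x - mean) ** 2 for x in readings_mm) / (n - 1))
u_typeA = s / math.sqrt(n)              # standard uncertainty of the mean

half_width_mm = 0.005                   # assumed resolution half-width
u_typeB = half_width_mm / math.sqrt(3)  # rectangular distribution

u_combined = math.sqrt(u_typeA ** 2 + u_typeB ** 2)
U_expanded = 2 * u_combined             # k = 2, roughly 95 % coverage
print(f"u_c = {u_combined:.6f} mm, U = {U_expanded:.6f} mm")
```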
Model prototype utilization in the analysis of fault tolerant control and data processing systems
NASA Astrophysics Data System (ADS)
Kovalev, I. V.; Tsarev, R. Yu; Gruzenkin, D. V.; Prokopenko, A. V.; Knyazkov, A. N.; Laptenok, V. D.
2016-04-01
A procedure for assessing the profit of implementing a control and data processing system is presented in the paper. The reasonability of creating and analyzing a model prototype follows from implementing fault tolerance through the inclusion of structural and software redundancy. The developed procedure allows finding the best ratio between the cost of developing and analyzing the model prototype and the earnings from utilizing it and the information produced. The suggested approach is illustrated by a model example of profit assessment and analysis of a control and data processing system.
An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation
ERIC Educational Resources Information Center
Leech, Nancy L.; Onwuegbuzie, Anthony J.
2007-01-01
One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to…
An Individual Differences Analysis of Double-Aspect Stimulus Perception.
ERIC Educational Resources Information Center
Forsyth, G. Alfred; Huber, R. John
Any theory of information processing must address both what is processed and how that processing takes place. Most studies investigating variables which alter physical dimension utilization have ignored the large individual differences in selective attention or cue utilization. A paradigm was developed using an individual focus on information…
Utility accommodation and conflict tracker (UACT) : user manual
DOT National Transportation Integrated Search
2009-02-01
Project 0-5475 performed a comprehensive analysis of utility conflict data/information flows between utility accommodation stakeholders in the Texas Department of Transportation project development process, developed data models to accommodate wo...
ERIC Educational Resources Information Center
Øye, Christine; Mekki, Tone Elin; Skaar, Randi; Dahl, Hellen; Forland, Oddvar; Jacobsen, Frode F.
2015-01-01
Knowledge utilization is politically "hot" because it informs decisions on improving the quality of care in nursing homes (NHs). The difficulties encountered in implementing evidence-based knowledge into practice may be explained by contextual factors. Contextual factors are crucial to understanding the process of knowledge utilization;…
Utility accommodation and conflict tracker (UACT) installation and configuration manual.
DOT National Transportation Integrated Search
2009-02-01
Project 0-5475 performed a comprehensive analysis of utility conflict data/information flows between utility accommodation stakeholders in the Texas Department of Transportation project development process, developed data models to accommodate wo...
Analysis of metabolic energy utilization in the Skylab astronauts
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1977-01-01
Skylab biomedical data regarding man's metabolic processes for extended periods of weightlessness is presented. The data was used in an integrated metabolic balance analysis which included analysis of Skylab water balance, electrolyte balance, evaporative water loss, and body composition. A theoretical analysis of energy utilization in man is presented. The results of the analysis are presented in tabular and graphic format.
Methods utilized in evaluating the profitability of commercial space processing
NASA Technical Reports Server (NTRS)
Bloom, H. L.; Schmitt, P. T.
1976-01-01
Profitability analysis is applied to commercial space processing on the basis of business concept definition and assessment and the relationship between ground and space functions. Throughput analysis is demonstrated by analysis of the space manufacturing of surface acoustic wave devices. The paper describes a financial analysis model for space processing and provides key profitability measures for space processed isoenzymes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This document contains reports from the proceedings of the 1995 U.S. DOE hydrogen program review. Reports are organized under the topics of systems analysis, utilization, storage, and production. This volume, Volume I, contains the reports concerned with systems analysis and utilization. Individual reports were processed separately for the DOE data bases.
NASA Technical Reports Server (NTRS)
Barrientos, Francesca; Castle, Joseph; McIntosh, Dawn; Srivastava, Ashok
2007-01-01
This document presents a preliminary evaluation of the utility of the FAA Safety Analytics Thesaurus (SAT) in enhancing automated document processing applications under development at NASA Ames Research Center (ARC). Current development efforts at ARC are described, including overviews of the statistical machine learning techniques that have been investigated. An analysis of opportunities for applying thesaurus knowledge to improve algorithm performance is then presented.
Decision Making Methods in Space Economics and Systems Engineering
NASA Technical Reports Server (NTRS)
Shishko, Robert
2006-01-01
This viewgraph presentation reviews various methods of decision making and the impact that they have on space economics and systems engineering. Some of the methods discussed are: Present Value and Internal Rate of Return (IRR); Cost-Benefit Analysis; Real Options; Cost-Effectiveness Analysis; Cost-Utility Analysis; Multi-Attribute Utility Theory (MAUT); and Analytic Hierarchy Process (AHP).
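Two of the listed methods, present value and internal rate of return, can be sketched directly. The cash flows are hypothetical, and bisection is one common way to find the IRR, not necessarily the presentation's approach:

```python
# NPV discounts each cash flow to the present; IRR is the discount rate at
# which NPV equals zero (found here by bisection, assuming one sign change).

def npv(rate, cashflows):
    """Net present value of cashflows[t] received at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=0.0, hi=1.0, tol=1e-9):
    """Rate in [lo, hi] where NPV crosses zero (NPV decreasing in rate)."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid          # NPV still positive: rate must be higher
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

flows = [-100.0, 40.0, 40.0, 40.0]   # initial outlay, then three returns
print(round(npv(0.10, flows), 2))    # NPV at a 10% discount rate
print(round(irr(flows), 4))          # break-even discount rate
```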
Vocational Education Operations Analysis Process.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento. Vocational Education Services.
This manual on the vocational education operations analysis process is designed to provide vocational administrators/coordinators with an internal device to collect, analyze, and display vocational education performance data. The first section describes the system and includes the following: analysis worksheet, data sources, utilization, system…
FTOOLS: A FITS Data Processing and Analysis Software Package
NASA Astrophysics Data System (ADS)
Blackburn, J. Kent; Greene, Emily A.; Pence, William
1993-05-01
FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Science Archive Research Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. The collection of utilities provides both generic processing and analysis utilities and utilities common to high energy astrophysics data sets. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self documenting through the IRAF help facility and a stand alone help task. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
Modeling of materials supply, demand and prices
NASA Technical Reports Server (NTRS)
1982-01-01
The societal, economic, and policy tradeoffs associated with materials processing and utilization are discussed. The materials system provides the materials engineer with the system analysis required to formulate sound materials processing, utilization, and resource development policies and strategies. The materials system simulation and modeling research program, including assessments of materials substitution dynamics, public policy implications, and materials process economics, was expanded. This effort includes several collaborative programs with materials engineers, economists, and policy analysts. The technical and socioeconomic issues of materials recycling, input-output analysis, and technological change and productivity are examined. The major thrust areas in materials systems research are outlined.
NASA Astrophysics Data System (ADS)
Guo, X.; Wu, Z.; Lv, C.
2017-12-01
The benefits of water utilization are formed by the material flow, energy flow, information flow, and value stream in the whole water cycle process, and are reflected in the material circulation of the inner system. Most traditional evaluations of water utilization benefits, however, operate at the macro level: they consider only the system-wide material input and output and the energy conversion relation, and do not characterize, from the formation mechanism, the benefits that accompany the water cycle process. In addition, most studies take the perspective of economics, attending only to overall economic output and the economic investment in sewage treatment while neglecting the ecological function benefits of the water cycle. Therefore, from the perspective of internal material circulation in the whole system, taking the water cycle process as a process of material circulation and energy flow, this study describes the circulation and flow of water together with other ecological, environmental, and socioeconomic elements; explores the composition of the positive and negative benefits of water utilization in the water-ecological-economic system; and analyzes the performance of each benefit. On this basis, an emergy calculation method for each benefit is proposed using quantitative emergy analysis, which enables a unified measurement and evaluation of water utilization benefits in the water-ecological-economic system. Taking Zhengzhou city as an example, the benefits corresponding to different water cycle links were calculated quantitatively by the emergy method. The results show that the emergy evaluation method of water utilization benefits can unify the ecosystem and the economic system, achieve uniform quantitative analysis, and comprehensively measure the true value of natural resources and human economic activities.
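The core emergy idea used in the abstract can be sketched in a few lines: each flow (in joules) is converted to solar emjoules through a transformity (unit emergy value), so ecological and economic flows become comparable on one scale. The flows and transformities below are hypothetical placeholders, not the study's values:

```python
# Emergy = energy flow x transformity (solar emjoules per joule).
# Summing over flows puts heterogeneous inputs on one common basis.

transformity_sej_per_J = {      # assumed unit emergy values
    "rainfall": 3.1e4,
    "river_water": 8.1e4,
    "electricity": 2.9e5,
}

flows_J = {                     # assumed annual energy flows
    "rainfall": 1.0e12,
    "river_water": 5.0e11,
    "electricity": 2.0e10,
}

emergy_sej = {k: flows_J[k] * transformity_sej_per_J[k] for k in flows_J}
total_sej = sum(emergy_sej.values())
print(f"total emergy = {total_sej:.3e} sej")
```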
NASA Astrophysics Data System (ADS)
McNeese, L. E.
1981-01-01
Increased utilization of coal and other fossil fuel alternatives as sources of clean energy is reported. The following topics are discussed: coal conversion development, chemical research and development, materials technology, component development and process evaluation studies, technical support to major liquefaction projects, process analysis and engineering evaluations, fossil energy environmental analysis, flue gas desulfurization, solid waste disposal, coal preparation waste utilization, plant control development, atmospheric fluidized bed coal combustor for cogeneration, TVA FBC demonstration plant program technical support, PFBC systems analysis, fossil fuel applications assessments, performance assurance system support for fossil energy projects, international energy technology assessment, and general equilibrium models of liquid and gaseous fuel supplies.
System review: a method for investigating medical errors in healthcare settings.
Alexander, G L; Stone, T T
2000-01-01
System analysis is a process of evaluating objectives, resources, structure, and design of businesses. System analysis can be used by leaders to collaboratively identify breakthrough opportunities to improve system processes. In healthcare systems, system analysis can be used to review medical errors (system occurrences) that may place patients at risk for injury, disability, and/or death. This study utilizes a case management approach to identify medical errors. Utilizing an interdisciplinary approach, a System Review Team was developed to identify trends in system occurrences, facilitate communication, and enhance the quality of patient care by reducing medical errors.
A research program in magnetogasdynamics utilizing hypervelocity coaxial plasma generators
NASA Technical Reports Server (NTRS)
Spight, C.
1976-01-01
A broadly-gauged research program in magnetogasdynamics utilizing hypervelocity coaxial plasma generators is presented. A complete hypervelocity coaxial plasma generator facility was assembled and tested. Significant progress was made in the direction of understanding the important processes in the interaction of hypervelocity MGD flow with transverse applied fields. It is now proposed to utilize the accumulated experimental capability and theoretical analysis in application to the analysis and design parameterization of pulsed magnetogasdynamic direct energy convertor configurations.
Advances in Proteomics Data Analysis and Display Using an Accurate Mass and Time Tag Approach
Zimmer, Jennifer S.D.; Monroe, Matthew E.; Qian, Wei-Jun; Smith, Richard D.
2007-01-01
Proteomics has recently demonstrated utility in understanding cellular processes on the molecular level as a component of systems biology approaches and for identifying potential biomarkers of various disease states. The large amount of data generated by utilizing high efficiency (e.g., chromatographic) separations coupled to high mass accuracy mass spectrometry for high-throughput proteomics analyses presents challenges related to data processing, analysis, and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics approaches and the accompanying data processing tools that have been developed to display and interpret the large volumes of data being produced. PMID:16429408
1996 State Transportation Improvement Program
DOT National Transportation Integrated Search
1996-01-01
The Wyoming Transportation Department utilizes a continuing and comprehensive process of Needs Analysis, Priority Rating, Financial Analysis, and Manpower Analysis. This "State Transportation Improvement Program" is the culmination of that process. ...
Goold, S D
1996-01-01
Assuming that rationing health care is unavoidable, and that it requires moral reasoning, how should we allocate limited health care resources? This question is difficult because our pluralistic, liberal society has no consensus on a conception of distributive justice. In this article I focus on an alternative: Who shall decide how to ration health care, and how shall this be done to respect autonomy, pluralism, liberalism, and fairness? I explore three processes for making rationing decisions: cost-utility analysis, informed democratic decision making, and applications of the veil of ignorance. I evaluate these processes as examples of procedural justice, assuming that there is no outcome considered the most just. I use consent as a criterion to judge competing processes so that rationing decisions are, to some extent, self-imposed. I also examine the processes' feasibility in our current health care system. Cost-utility analysis does not meet criteria for actual or presumed consent, even if costs and health-related utility could be measured perfectly. Existing structures of government cannot creditably assimilate the information required for sound rationing decisions, and grassroots efforts are not representative. Applications of the veil of ignorance are more useful for identifying principles relevant to health care rationing than for making concrete rationing decisions. I outline a process of decision making, specifically for health care, that relies on substantive, selected representation, respects pluralism, liberalism, and deliberative democracy, and could be implemented at the community or organizational level.
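The cost-utility analysis the article evaluates reduces, in its simplest form, to an incremental cost per quality-adjusted life year (QALY). A minimal sketch, with all costs, utilities, and durations hypothetical:

```python
# Incremental cost-utility ratio: extra cost of the new option divided by the
# extra QALYs it yields, the standard metric in cost-utility analysis.

def qalys(utility_per_year, years):
    """Quality-adjusted life years: health-state utility times duration."""
    return utility_per_year * years

cost_new, cost_old = 50_000.0, 20_000.0   # hypothetical treatment costs
qaly_new = qalys(0.85, 10)                # 10 years at utility 0.85
qaly_old = qalys(0.70, 9)                 # 9 years at utility 0.70

icer = (cost_new - cost_old) / (qaly_new - qaly_old)
print(f"ICER = ${icer:,.0f} per QALY gained")
```

As the article argues, even a perfectly measured ratio like this says nothing about whether those affected would consent to rationing by it.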
Examining the Utility of Topic Models for Linguistic Analysis of Couple Therapy
ERIC Educational Resources Information Center
Doeden, Michelle A.
2012-01-01
This study examined the basic utility of topic models, a computational linguistics model for text-based data, to the investigation of the process of couple therapy. Linguistic analysis offers an additional lens through which to examine clinical data, and the topic model is presented as a novel methodology within couple and family psychology that…
Visually enhanced CCTV digital surveillance utilizing Intranet and Internet.
Ozaki, Nobuyuki
2002-07-01
This paper describes a solution for integrated plant supervision utilizing closed-circuit television (CCTV) digital surveillance. Three basic requirements are first addressed as the platform of the system, with a discussion of suitable video compression. The system configuration is described in blocks. The system provides surveillance functionality (real-time monitoring) and process analysis functionality (a troubleshooting tool). The paper formulates a practical performance design for determining various encoder parameters. It also introduces image processing techniques for enhancing the original CCTV digital image to lessen the burden on operators. Screenshots are shown for the surveillance functionality. For process analysis, an image searching filter supported by image processing techniques is explained with screenshots. Multimedia surveillance, the merger with process data surveillance (the SCADA system), is also explained.
Sensitivity analysis of the add-on price estimate for the edge-defined film-fed growth process
NASA Technical Reports Server (NTRS)
Mokashi, A. R.; Kachare, A. H.
1981-01-01
The analysis is in terms of cost parameters and production parameters. The cost parameters include equipment, space, direct labor, materials, and utilities. The production parameters include growth rate, process yield, and duty cycle. A computer program was developed specifically to do the sensitivity analysis.
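A one-at-a-time sensitivity analysis in the spirit of the abstract can be sketched as follows. The cost model and every number below are assumptions for illustration, not the paper's model or data:

```python
# Hypothetical add-on price model: cost per hour divided by effective output,
# where output depends on growth rate, process yield, and duty cycle.
# Perturbing one production parameter at a time shows price sensitivity.

def add_on_price(cost_per_hour, growth_rate_m2_per_hour,
                 process_yield, duty_cycle):
    effective_output = growth_rate_m2_per_hour * process_yield * duty_cycle
    return cost_per_hour / effective_output   # $ per m^2 of usable ribbon

base = dict(cost_per_hour=50.0, growth_rate_m2_per_hour=0.5,
            process_yield=0.9, duty_cycle=0.8)
p0 = add_on_price(**base)

# Bump each production parameter by +10% and report the price change.
for name in ("growth_rate_m2_per_hour", "process_yield", "duty_cycle"):
    bumped = dict(base, **{name: base[name] * 1.1})
    p = add_on_price(**bumped)
    print(f"{name}: {100 * (p - p0) / p0:+.1f}% price change")
```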
Preliminary System Analysis of In Situ Resource Utilization for Mars Human Exploration
NASA Technical Reports Server (NTRS)
Rapp, Donald; Andringa, Jason; Easter, Robert; Smith, Jeffrey H .; Wilson, Thomas; Clark, D. Larry; Payne, Kevin
2005-01-01
We carried out a system analysis of processes for utilization of Mars resources to support human exploration of Mars by production of propellants from indigenous resources. Seven ISRU processes were analyzed to determine mass, power, and propellant storage volume requirements. The major elements of each process include CO2 acquisition, chemical conversion, and storage of propellants. Based on a figure of merit (the ratio of the mass of propellants that must be brought from Earth in a non-ISRU mission to the mass of the ISRU system, tanks, and feedstocks that must be brought from Earth for an ISRU mission), the most attractive process by far is one where indigenous Mars water is accessible and is processed via Sabatier/electrolysis to methane and oxygen. These processes are technically relatively mature. Other processes with positive leverage involve reverse water gas shift and solid oxide electrolysis.
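The figure of merit defined in the abstract can be computed directly. The masses below are hypothetical placeholders, not values from the study:

```python
# Figure of merit: propellant mass a non-ISRU mission must launch from Earth,
# divided by the Earth-launched mass of the ISRU plant, tanks, and feedstock.
# Values above 1 mean ISRU provides positive mass leverage.

def isru_figure_of_merit(propellant_mass_non_isru_kg,
                         isru_system_kg, tanks_kg, feedstock_kg):
    return propellant_mass_non_isru_kg / (
        isru_system_kg + tanks_kg + feedstock_kg)

fom = isru_figure_of_merit(30_000,            # hypothetical propellant need
                           isru_system_kg=2_000,
                           tanks_kg=1_500,
                           feedstock_kg=500)
print(f"figure of merit = {fom:.1f}")
```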
Yu, Yuncui; Jia, Lulu; Meng, Yao; Hu, Lihua; Liu, Yiwei; Nie, Xiaolu; Zhang, Meng; Zhang, Xuan; Han, Sheng; Peng, Xiaoxia; Wang, Xiaoling
2018-04-01
Establishing a comprehensive clinical evaluation system is critical in enacting national drug policy and promoting rational drug use. In China, the 'Clinical Comprehensive Evaluation System for Pediatric Drugs' (CCES-P) project, which aims to compare drugs based on clinical efficacy and cost effectiveness to help decision makers, was recently proposed; therefore, a systematic and objective method is required to guide the process. An evidence-based multi-criteria decision analysis model that involves an analytic hierarchy process (AHP) was developed, consisting of nine steps: (1) select the drugs to be reviewed; (2) establish the evaluation criterion system; (3) determine the criterion weights based on the AHP; (4) construct the body of evidence for each drug under evaluation; (5) select comparative measures and calculate the original utility score; (6) place scores on a common utility scale and calculate the standardized utility score; (7) calculate the comprehensive utility score; (8) rank the drugs; and (9) perform a sensitivity analysis. The model was applied to the evaluation of three different inhaled corticosteroids (ICSs) used for asthma management in children (a total of 16 drugs with different dosage forms and strengths or different manufacturers). By applying the drug analysis model, the 16 ICSs under review were successfully scored and evaluated. Budesonide suspension for inhalation (drug ID number: 7) ranked the highest, with a comprehensive utility score of 80.23, followed by fluticasone propionate inhaled aerosol (drug ID number: 16), with a score of 79.59, and budesonide inhalation powder (drug ID number: 6), with a score of 78.98. In the sensitivity analysis, the rankings of the top five and lowest five drugs remained unchanged, suggesting the model is generally robust. An evidence-based drug evaluation model based on the AHP was successfully developed. The model incorporates sufficient utility and flexibility for aiding the decision-making process, and can be a useful tool for the CCES-P.
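The AHP weighting and scoring steps described above can be sketched with the common column-normalization approximation of the principal eigenvector. The comparison matrix, criteria, and drug scores below are invented for illustration and are not the study's data:

```python
# AHP sketch: derive criterion weights from a pairwise comparison matrix,
# then combine standardized utility scores into one comprehensive score.

comparison = [            # hypothetical criteria: efficacy, safety, cost
    [1.0, 3.0, 5.0],      # efficacy judged 3x safety, 5x cost
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]

n = len(comparison)
col_sums = [sum(comparison[i][j] for i in range(n)) for j in range(n)]
normalized = [[comparison[i][j] / col_sums[j] for j in range(n)]
              for i in range(n)]
weights = [sum(normalized[i]) / n for i in range(n)]  # row averages

scores = {                # hypothetical standardized utility scores
    "drug_A": [80.0, 70.0, 60.0],
    "drug_B": [65.0, 85.0, 90.0],
}
comprehensive = {d: sum(w * s for w, s in zip(weights, v))
                 for d, v in scores.items()}
ranking = sorted(comprehensive, key=comprehensive.get, reverse=True)
print(ranking, {d: round(v, 1) for d, v in comprehensive.items()})
```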
Using Operational Analysis to Improve Access to Pulmonary Function Testing.
Ip, Ada; Asamoah-Barnieh, Raymond; Bischak, Diane P; Davidson, Warren J; Flemons, W Ward; Pendharkar, Sachin R
2016-01-01
Background. Timely pulmonary function testing is crucial to improving diagnosis and treatment of pulmonary diseases. Perceptions of poor access at an academic pulmonary function laboratory prompted analysis of system demand and capacity to identify factors contributing to poor access. Methods. Surveys and interviews identified stakeholder perspectives on operational processes and access challenges. Retrospective data on testing demand and resource capacity was analyzed to understand utilization of testing resources. Results. Qualitative analysis demonstrated that stakeholder groups had discrepant views on access and capacity in the laboratory. Mean daily resource utilization was 0.64 (SD 0.15), with monthly average utilization consistently less than 0.75. Reserved testing slots for subspecialty clinics were poorly utilized, leaving many testing slots unfilled. When subspecialty demand exceeded number of reserved slots, there was sufficient capacity in the pulmonary function schedule to accommodate added demand. Findings were shared with stakeholders and influenced scheduling process improvements. Conclusion. This study highlights the importance of operational data to identify causes of poor access, guide system decision-making, and determine effects of improvement initiatives in a variety of healthcare settings. Importantly, simple operational analysis can help to improve efficiency of health systems with little or no added financial investment.
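The utilization metric at the center of the study is simply demand over capacity, averaged across days. A minimal sketch with hypothetical counts (the paper's figures were mean 0.64, SD 0.15):

```python
import statistics

# Daily utilization = tests performed / available testing slots.
slots_per_day = 20                        # hypothetical capacity
tests_done = [14, 11, 16, 12, 13, 10, 15]  # one hypothetical week of demand

utilization = [d / slots_per_day for d in tests_done]
mean_u = statistics.mean(utilization)
sd_u = statistics.stdev(utilization)
print(f"mean utilization {mean_u:.2f} (SD {sd_u:.2f})")
```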
Acoustics based assessment of respiratory diseases using GMM classification.
Mayorga, P; Druzgalski, C; Morelos, R L; Gonzalez, O H; Vidales, J
2010-01-01
The focus of this paper is to present a method utilizing lung sounds for a quantitative assessment of patient health as it relates to respiratory disorders. To accomplish this, applicable traditional techniques from the speech processing domain were utilized to evaluate lung sounds obtained with a digital stethoscope. Traditional methods of evaluating asthma involve auscultation and spirometry, but utilization of the more sensitive electronic stethoscopes now available, together with quantitative signal analysis methods, offers opportunities for improved diagnosis. In particular, we propose an acoustic evaluation methodology based on Gaussian Mixture Models (GMM), which should assist in broader analysis, identification, and diagnosis of asthma based on frequency-domain analysis of wheezing and crackles.
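The GMM classification principle can be shown with a toy one-dimensional example: each class is a small Gaussian mixture over an acoustic feature, and a sample track is assigned to the class with the higher log-likelihood. The mixtures and feature values below are invented, not the paper's trained models:

```python
import math

def gaussian_pdf(x, mean, var):
    return (math.exp(-((x - mean) ** 2) / (2 * var))
            / math.sqrt(2 * math.pi * var))

def gmm_loglik(samples, components):
    """Log-likelihood of samples under a mixture of (weight, mean, var)."""
    return sum(math.log(sum(w * gaussian_pdf(x, m, v)
                            for w, m, v in components))
               for x in samples)

# Hypothetical mixtures over a dominant-frequency feature (kHz):
normal_gmm = [(0.7, 0.2, 0.01), (0.3, 0.5, 0.04)]
wheeze_gmm = [(0.6, 0.4, 0.01), (0.4, 0.9, 0.04)]

feature_track = [0.38, 0.42, 0.85, 0.41]   # made-up wheeze-like samples
label = ("wheeze" if gmm_loglik(feature_track, wheeze_gmm)
         > gmm_loglik(feature_track, normal_gmm) else "normal")
print(label)
```

Real systems fit the mixture parameters from labeled recordings (e.g. by expectation-maximization) rather than fixing them by hand as here.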
Sandia Engineering Analysis Code Access System v. 2.0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sjaardema, Gregory D.
The Sandia Engineering Analysis Code Access System (SEACAS) is a suite of preprocessing, post processing, translation, visualization, and utility applications supporting finite element analysis software using the Exodus database file format.
Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian
2011-08-30
Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.
Research interests: geospatial data analysis using parallel processing; high-performance computing; renewable resource technical potential and supply curve analysis; spatial database utilization; rapid analysis of large geospatial datasets; energy and geospatial analysis products; rapid, web-based renewable resource analysis.
NASA Astrophysics Data System (ADS)
Drieniková, Katarína; Hrdinová, Gabriela; Naňo, Tomáš; Sakál, Peter
2010-01-01
The paper deals with the analysis of the theory of corporate social responsibility, risk management, and the exact method of the analytic hierarchy process used in decision-making processes. Chapters 2 and 3 present experience with applying the method to formulating stakeholders' strategic goals within Corporate Social Responsibility (CSR) and, simultaneously, to minimizing environmental risks. The major benefit of this paper is the application of the Analytic Hierarchy Process (AHP).
Health state utilities: a framework for studying the gap between the imagined and the real.
Stiggelbout, Anne M; de Vogel-Voogt, Elsbeth
2008-01-01
Health state utilities play an important role in decision analysis and cost-utility analysis. The question of whose utilities to use at various levels of health-care decision-making has been the subject of considerable debate. The observation that patients often value their own health, and also other health states, higher than members of the general public do raises the question of what underlies such differences. Is it an artifact of the valuation methods? Is it adaptation versus poor anticipated adaptation? This article describes a framework for understanding and studying potential mechanisms that play a role in health state valuation. It aims to connect research from different fields so that cross-fertilization of ideas may occur. The framework is based on stimulus-response models from social judgment theory. For each phase, from stimulus, through information interpretation and integration, to judgment, and, finally, to response, we provide evidence of factors and processes that may lead to different utilities in patients and healthy subjects. Examples are the lack of scope of scenarios in the stimulus phase, and appraisal processes and framing effects in the information interpretation phase. Factors that play a role in the judgment phase include heuristics and biases, adaptation, and comparison processes. Mechanisms related to the response phase include end-aversion bias, probability distortion, and noncompensatory decision-making. The framework serves to explain many of the differences in valuations between respondent groups. We discuss some of the findings as they relate to the field of response shift research. We propose issues for discussion in the field and suggestions for improving the process of utility assessment.
Helping Water Utilities Grapple with Climate Change
NASA Astrophysics Data System (ADS)
Yates, D.; Gracely, B.; Miller, K.
2008-12-01
The Water Research Foundation (WRF), serving the drinking water industry, and the National Center for Atmospheric Research (NCAR) are collaborating on an effort to develop and implement locally relevant, structured processes to help water utilities consider the impacts of climate variability and change on their water systems and the adaptation options available to them. Adopting a case-study approach, the structured process includes 1) a problem definition phase, focused on identifying goals, information needs, utility vulnerabilities, and possible adaptation options in the face of climate and hydrologic uncertainty; 2) developing and/or modifying system-specific Integrated Water Resource Management (IWRM) models and conducting sensitivity analysis to identify critical variables; 3) developing probabilistic climate change scenarios focused on exploring uncertainties identified as important in the sensitivity analysis of step 2; and 4) implementing the structured process and examining approaches to decision making under uncertainty. Collaborators include eight drinking water utilities and two state agencies. The utilities are 1) the Inland Empire Utility Agency, CA; 2) the El Dorado Irrigation District, Placerville, CA; 3) the Portland Water Bureau, Portland, OR; 4) Colorado Springs Utilities, Colorado Springs, CO; 5) Cincinnati Water, Cincinnati, OH; 6) the Massachusetts Water Resources Authority (MWRA), Boston, MA; 7) Durham Water, Durham, NC; and 8) Palm Beach County Water (PBCW), Palm Beach, FL. The collaborating state agencies are the California Department of Water Resources and the Colorado Water Conservation Board.
What Do HPT Consultants Do for Performance Analysis?
ERIC Educational Resources Information Center
Kang, Sung
2017-01-01
This study was conducted to contribute to the field of Human Performance Technology (HPT) through the validation of the performance analysis process of the International Society for Performance Improvement (ISPI) HPT model, the most representative and frequently utilized process model in the HPT field. The study was conducted using content…
50 CFR 37.53 - Submission of data and information.
Code of Federal Regulations, 2011 CFR
2011-10-01
... processing. (c) Processed geophysical information shall be submitted with extraneous signals and interference... of data gathering or utilization, i.e., acquisition, processing, reprocessing, analysis, and... survey conducted under the permittee's permit, including digital navigational data, if obtained, and...
ERIC Educational Resources Information Center
Dickstein, Gary G.
2011-01-01
This study contributes to the research regarding processes and procedures used by two institutions of higher education to respond to students who engage in inappropriate behavior and who are concomitantly experiencing a mental health crisis. A case study analysis of two institutions of higher education was used to examine this issue. The…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humme, J.T.; Tanaka, M.T.; Yokota, M.H.
1979-07-01
The purpose of this study was to determine the feasibility of geothermal resource utilization at the Puna Sugar Company cane sugar processing plant, located in Keaau, Hawaii. A proposed well site area was selected based on data from surface exploratory surveys. The liquid-dominated well flow enters a binary thermal arrangement, which results in an acceptable quality steam for process use. Hydrogen sulfide in the well gases is incinerated, leaving sulfur dioxide in the waste gases. The sulfur dioxide in turn is recovered and used in the cane juice processing at the sugar factory. The clean geothermal steam from the binary system can be used directly for process requirements. It replaces steam generated by the firing of the waste fibrous product from cane sugar processing. The waste product, called bagasse, has a number of alternative uses, but an evaluation clearly indicated it should continue to be employed for steam generation. This steam, no longer required for process demands, can be directed to increased electric power generation. Revenues gained by the sale of this power to the utility, in addition to other savings developed through the utilization of geothermal energy, can offset the costs associated with hydrothermal utilization.
Evaluation of a Stirling Solar Dynamic System for Lunar Oxygen Production
NASA Technical Reports Server (NTRS)
Colozza, Anthony J.; Wong, Wayne A.
2006-01-01
An evaluation of a solar concentrator-based system for producing oxygen from the lunar regolith was performed. The system uses a solar concentrator mirror to provide thermal energy for the oxygen production process, as well as to power a Stirling heat engine for the production of electricity. The electricity produced is used to operate the equipment needed in the oxygen production process. The oxygen production method used in the analysis was the hydrogen reduction of ilmenite. Using this method, a baseline system design was produced. This baseline system had an oxygen production rate of 0.6 kg/hr with a concentrator mirror size of 5 m. Variations were performed on the baseline design to show how changes in the system size and process rate affected the oxygen production rate.
Bioleaching of multiple metals from contaminated sediment by moderate thermophiles.
Gan, Min; Jie, Shiqi; Li, Mingming; Zhu, Jianyu; Liu, Xinxing
2015-08-15
A moderately thermophilic consortium was applied to bioleach multiple metals from contaminated sediment. The consortium achieved higher acidification and metal solubilization efficiency than the pure strains. The synergistic effect of the thermophilic consortium accelerated substrate utilization, which started with sulfur in the early stage; the pH then declined, giving rise to use of the pyrite. Community dynamics showed that A. caldus was the predominant bacterium during the whole bioleaching process, while the abundance of S. thermotolerans increased together with pyrite utilization. Solubilization efficiencies of Zn, Cu, Mn, and Cd reached 98%, 94%, 95%, and 89% respectively, while those of As, Hg, and Pb were only 45%, 34%, and 22%. A logistic model was used to simulate the bioleaching process, with a goodness of fit above 90%. Correlation analysis revealed that metal leaching was mainly an acid solubilization process. Fraction analysis revealed that metals decreased in mobility and bioavailability.
Yilmaz, Vedat; Ince-Yilmaz, Ebru; Yilmazel, Yasemin Dilsad; Duran, Metin
2014-06-01
In this study, biomass samples were obtained from six municipal and nine industrial full-scale anaerobic processes to investigate whether the aceticlastic methanogen population composition is related to acetate utilization capacity and the nature of the wastewater treated, i.e. municipal sludge or industrial wastewater. Batch serum bottle tests were used to determine the specific acetate utilization rate (AUR), and a quantitative real-time polymerase chain reaction protocol was used to enumerate the acetate-utilizing Methanosaeta and Methanosarcina populations in the biomass samples. Methanosaeta was the dominant aceticlastic methanogen in all samples, except for one industrial wastewater-treating anaerobic process. However, Methanosarcina density in industrial biomass samples was higher than the Methanosarcina density in the municipal samples. The average AUR values of municipal and industrial wastewater treatment plant biomass samples were 10.49 and 10.65 mg CH3COO(-)/log(aceticlastic methanogen gene copy).d, respectively. A one-way ANOVA test and principal component analysis showed that the acetate utilization capacities and aceticlastic methanogen community composition did not show a statistically significant correlation among the municipal digesters and industrial wastewater-treating processes investigated.
NursesforTomorrow: a proactive approach to nursing resource analysis.
Bournes, Debra A; Plummer, Carolyn; Miller, Robert; Ferguson-Paré, Mary
2010-03-01
This paper describes the background, development, implementation and utilization of NursesforTomorrow (N4T), a practical and comprehensive nursing human resources analysis method to capture regional, institutional and patient care unit-specific actual and predicted nurse vacancies, nurse staff characteristics and nurse staffing changes. Reports generated from the process include forecasted shortfalls or surpluses of nurses, percentage of novice nurses, occupancy, sick time, overtime, agency use and other metrics. Readers will benefit from a description of the ways in which the data generated from the nursing resource analysis process are utilized at senior leadership, program and unit levels to support proactive hiring and resource allocation decisions and to predict unit-specific recruitment and retention patterns across multiple healthcare organizations and regions.
NASA Astrophysics Data System (ADS)
Jayamani, E.; Perera, D. S.; Soon, K. H.; Bakri, M. K. B.
2017-04-01
A systematic method of material analysis aiming for fuel efficiency improvement through the utilization of natural fiber reinforced polymer matrix composites in the automobile industry is proposed. Multi-factor decision criteria with the Analytical Hierarchy Process (AHP) were used and executed through MATLAB to achieve improved fuel efficiency through the weight reduction of vehicular components, by effective comparison between two engine hood designs. The reduction was simulated by utilizing natural fiber polymer composites with thermoplastic polypropylene (PP) as the matrix polymer, benchmarked against a synthetic-based composite component. Results showed that PP with 35% flax fiber loading achieved a 0.4% improvement in fuel efficiency, the highest among the 27 candidate fibers.
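The core AHP computation that such a multi-factor comparison relies on can be sketched as follows. This is a minimal illustrative implementation, not the authors' MATLAB code, and the pairwise comparison matrix below (weight reduction vs. cost vs. stiffness) is hypothetical:

```python
# Illustrative sketch of AHP weight derivation via power iteration
# (pure Python; the criteria and matrix values are made-up assumptions).

def ahp_weights(matrix, iters=100):
    """Approximate the principal eigenvector of a pairwise comparison
    matrix by power iteration, normalized so the weights sum to 1."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

def consistency_ratio(matrix, weights):
    """Saaty's consistency ratio for a 3x3 matrix (random index RI = 0.58)."""
    n = len(matrix)
    # lambda_max estimated by averaging (A w)_i / w_i over the rows.
    lam = sum(sum(matrix[i][j] * weights[j] for j in range(n)) / weights[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / 0.58

# Hypothetical criteria comparison: weight reduction vs. cost vs. stiffness.
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
w = ahp_weights(A)
print([round(x, 3) for x in w])           # criterion weights, sum to 1
print(round(consistency_ratio(A, w), 3))  # should be well below 0.1
```

A judgment matrix with a consistency ratio above roughly 0.1 is conventionally revisited before the weights are trusted.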
Modeling Choice Under Uncertainty in Military Systems Analysis
1991-11-01
operators rather than fuzzy operators. This is suggested for further research. 4.3 ANALYTIC HIERARCHICAL PROCESS (AHP): In AHP, objectives, functions and... 4.1 IMPRECISELY SPECIFIED MULTIPLE ATTRIBUTE UTILITY THEORY... 4.2 FUZZY DECISION ANALYSIS... 4.3 ANALYTIC HIERARCHICAL PROCESS (AHP)... 4.4 SUBJECTIVE TRANSFER FUNCTION APPROACH
Teaching Special Education Teachers How to Conduct Functional Analysis in Natural Settings
ERIC Educational Resources Information Center
Erbas, Dilek; Tekin-Iftar, Elif; Yucesoy, Serife
2006-01-01
The effects of a training program used to teach teachers of children with developmental disabilities how to conduct the functional analysis process were examined. Furthermore, teachers' opinions regarding this process were investigated. A multiple probe design across subjects with probe conditions was used. Teacher training was in two phases. In the…
On-road anomaly detection by multimodal sensor analysis and multimedia processing
NASA Astrophysics Data System (ADS)
Orhan, Fatih; Eren, P. E.
2014-03-01
The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a framework that enables the analysis of sensors in a multimodal fashion. It also provides plugin-based analysis interfaces for developing sensor- and image-processing-based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.
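The hard-brake case of such a detector can be sketched as a simple threshold rule on the longitudinal accelerometer signal. This is a hedged re-creation of the idea, not the study's implementation; the threshold and minimum-duration values are hypothetical:

```python
# Threshold-based hard-brake detection on an accelerometer trace.
# Threshold (-3.0 m/s^2) and min_samples are illustrative assumptions.

def detect_hard_brakes(accel, threshold=-3.0, min_samples=3):
    """Flag index ranges where longitudinal acceleration (m/s^2) stays
    at or below `threshold` for at least `min_samples` samples."""
    events, start = [], None
    for i, a in enumerate(accel):
        if a <= threshold:
            if start is None:
                start = i          # a braking run begins
        else:
            if start is not None and i - start >= min_samples:
                events.append((start, i))
            start = None
    if start is not None and len(accel) - start >= min_samples:
        events.append((start, len(accel)))   # run extends to end of trace
    return events

# Simulated trace: cruising, then a sharp three-sample deceleration.
trace = [0.1, 0.0, -0.5, -3.5, -4.2, -3.8, -0.4, 0.2]
print(detect_hard_brakes(trace))  # -> [(3, 6)]
```

A production detector would additionally low-pass filter the signal and rotate device axes into the vehicle frame before thresholding, which this sketch omits.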
2011-01-01
Background: Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results: We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome, and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion: The CloVR VM and associated architecture lower the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105
Counter-Learning under Oppression
ERIC Educational Resources Information Center
Kucukaydin, Ilhan
2010-01-01
This qualitative study utilized the method of narrative analysis to explore the counter-learning process of an oppressed Kurdish woman from Turkey. Critical constructivism was utilized to analyze counter-learning; Frankfurt School-based Marcusian critical theory was used to analyze the sociopolitical context and its impact on the oppressed. Key…
Instructional television utilization in the United States
NASA Technical Reports Server (NTRS)
Dumolin, J. R.
1971-01-01
Various aspects of utilizing instructional television (ITV) are summarized and evaluated and basic guidelines for future utilization of television as an instructional medium in education are considered. The role of technology in education, capabilities and limitations of television as an instructional media system and the state of ITV research efforts are discussed. Examples of various ongoing ITV programs are given and summarized. The problems involved in the three stages of the ITV process (production, distribution, and classroom utilization) are presented. A summary analysis outlines probable trends in future utilization.
Old River Control Complex Sedimentation Investigation
2015-06-01
efforts to describe the shoaling processes and sediment transport in the two-river system. Geomorphic analysis: the geomorphic assessment utilized... District, New Orleans. The investigation was conducted via a combination of field data collection and laboratory analysis, geomorphic assessments, and...
Roysden, Nathaniel; Wright, Adam
2015-01-01
Mental health problems are an independent predictor of increased healthcare utilization. We created random forest classifiers for predicting two outcomes following a patient's first behavioral health encounter: decreased utilization by any amount (AUROC 0.74) and ultra-high absolute utilization (AUROC 0.88). These models may be used for clinical decision support by referring providers, to automatically detect patients who may benefit from referral, for cost management, or for risk/protection factor analysis.
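The AUROC figures quoted above can be computed without any modeling library via the Mann-Whitney interpretation: the probability that a randomly chosen positive case outranks a randomly chosen negative one. A minimal sketch (the study's random forests are not reproduced here; the labels and scores below are synthetic):

```python
# AUROC from raw classifier scores via the Mann-Whitney U statistic.
# Data is synthetic, for illustration only.

def auroc(labels, scores):
    """Probability that a random positive's score exceeds a random
    negative's score, counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.1]
print(auroc(labels, scores))  # 8 of 9 positive/negative pairs ranked correctly
```

This O(P·N) pairwise form is fine for illustration; large datasets would use the rank-sum formulation instead.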
Feasibilities of a Coal-Biomass to Liquids Plant in Southern West Virginia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharyya, Debangsu; DVallance, David; Henthorn, Greg
This project has generated comprehensive and realistic feasibility results for a coal-biomass to liquids (CBTL) plant in southern West Virginia, and evaluated the sensitivity of the analyses to various anticipated scenarios and parametric uncertainties. Specifically, the project addressed economic, technical, market, and financial feasibility. In the economic feasibility study, a multi-objective siting model was developed and then used to identify and rank suitable facility sites. Spatial models were also developed to assess biomass and coal feedstock availability and economics. Environmental impact analysis was conducted mainly through life cycle analysis of greenhouse gas emissions. Uncertainty and sensitivity analyses were also investigated in this study. Sensitivity analyses on the required selling price (RSP) and greenhouse gas (GHG) emissions of CBTL fuels were conducted with respect to feedstock availability and price, biomass-to-coal mix ratio, conversion rate, internal rate of return (IRR), capital cost, and operation and maintenance cost. The study of siting and capacity showed that the feedstock mix ratio limited CBTL production. The price of coal had a more dominant effect on RSP than that of biomass. Different mix ratios in the feedstock and conversion rates led to RSPs ranging from $104.3 to $157.9/bbl. LCA results indicated that GHG emissions ranged from 80.62 kg CO2 eq to 101.46 kg CO2 eq per 1,000 MJ of liquid fuel at various biomass-to-coal mix ratios and conversion rates if carbon capture and storage (CCS) was applied. Most of the water and fossil energy was consumed in the conversion process. Compared to petroleum-derived liquid fuels, the reduction in GHG emissions could be between -2.7% and 16.2% with CBTL substitution. As for the technical study, three CBTL approaches (direct, indirect, and hybrid) were considered in the analysis.
Process models covering conceptual design, process modeling, and process validation were developed and validated for different cases. Equipment design and capital costs were investigated for capital cost estimation and economic model validation. Material and energy balances and techno-economic analysis of the base case were conducted to evaluate the projects. Sensitivity studies of the direct and indirect approaches were also used to evaluate the CBTL plant's economic performance. In this study, techno-economic analyses were conducted in the Aspen Process Economic Analyzer (APEA) environment for indirect, direct, and hybrid CBTL plants with CCS, based on high-fidelity process models developed in Aspen Plus and Excel. The process thermal efficiency ranges from 45% to 67%. The break-even oil price ranges from $86.1 to $100.6 per barrel for small-scale (10,000 bbl/day) CBTL plants and from $65.3 to $80.5 per barrel for large-scale (50,000 bbl/day) CBTL plants. Increasing the biomass/coal ratio from 8/92 to 20/80 would increase the break-even oil price of the indirect CBTL plant by $3/bbl and decrease that of the direct CBTL plant by about $1/bbl. The order of carbon capture penalty is direct > indirect > hybrid. The order of capital investment is hybrid (with or without shale gas utilization) > direct (without shale gas utilization) > indirect > direct (with shale gas utilization). The order of thermal efficiency is direct > hybrid > indirect. The order of break-even oil price is hybrid (without shale gas utilization) > direct (without shale gas utilization) > hybrid (with shale gas utilization) > indirect > direct (with shale gas utilization).
Characterization of Neurofibromas of the Skin and Spinal Roots in a Mouse Model
2011-02-01
renewal program of stem/progenitor cells can cause tumorigenesis. By utilizing genetically engineered mouse models of neurofibromatosis type 1 (NF1)... sympathetic ganglia and adrenal medulla and died at birth (Gitler et al., 2003). To circumvent early lethality of the Nf1NC mice, we utilized a previously... Supplemental experimental procedures. Tissue Processing: For histological analysis, we utilized both paraffin sections and frozen sections. For both
Graphics Processing Unit Assisted Thermographic Compositing
NASA Technical Reports Server (NTRS)
Ragasa, Scott; Russell, Samuel S.
2012-01-01
Objective: Develop a software application utilizing high-performance computing techniques, including general-purpose graphics processing units (GPGPUs), for the analysis and visualization of large thermographic data sets. Over the past several years, an increasing effort among scientists and engineers to utilize graphics processing units (GPUs) in a more general-purpose fashion is allowing for previously unobtainable levels of computation by individual workstations. As data sets grow, the methods to work with them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU, which yield significant increases in performance. These common computations have high degrees of data parallelism; that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Image processing is one area where GPUs are being used to greatly increase the performance of certain analysis and visualization techniques.
Hawkins, H; Langer, J; Padua, E; Reaves, J
2001-06-01
Activity-based costing (ABC) is a process that enables the estimation of the cost of producing a product or service. More accurate than traditional charge-based approaches, it emphasizes analysis of processes, and more specific identification of both direct and indirect costs. This accuracy is essential in today's healthcare environment, in which managed care organizations necessitate responsible and accountable costing. However, to be successfully utilized, it requires time, effort, expertise, and support. Data collection can be tedious and expensive. By integrating ABC with information management (IM) and systems (IS), organizations can take advantage of the process orientation of both, extend and improve ABC, and decrease resource utilization for ABC projects. In our case study, we have examined the process of a multidisciplinary breast center. We have mapped the constituent activities and established cost drivers. This information has been structured and included in our information system database for subsequent analysis.
Automated Student Aid Processing: The Challenge and Opportunity.
ERIC Educational Resources Information Center
St. John, Edward P.
1985-01-01
To utilize automated technology for student aid processing, it is necessary to work with multi-institutional offices (student aid, admissions, registration, and business) and to develop automated interfaces with external processing systems at state and federal agencies and perhaps at need-analysis organizations and lenders. (MLW)
Kagan, Jonathan M; Rosas, Scott; Trochim, William M K
2010-10-01
New discoveries in basic science are creating extraordinary opportunities to design novel biomedical preventions and therapeutics for human disease. But the clinical evaluation of these new interventions is, in many instances, being hindered by a variety of legal, regulatory, policy, and operational factors, few of which enhance research quality, the safety of study participants, or research ethics. With the goal of helping increase the efficiency and effectiveness of clinical research, we have examined how the integration of utilization-focused evaluation with elements of business process modeling can reveal opportunities for systematic improvements in clinical research. Using data from the NIH global HIV/AIDS clinical trials networks, we analyzed the absolute and relative times required to traverse defined phases associated with specific activities within the clinical protocol lifecycle. Using simple median duration and Kaplan-Meier survival analysis, we show how such time-based analyses can provide a rationale for the prioritization of research process analysis and re-engineering, as well as a means for statistically assessing the impact of policy modifications, resource utilization, re-engineered processes, and best practices. Successfully applied, this approach can help researchers be more efficient in capitalizing on new science to speed the development of improved interventions for human disease.
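The Kaplan-Meier machinery used for such phase-duration analysis is compact enough to sketch from scratch. The NIH protocol-lifecycle data are not reproduced here; the sample durations below are made up, with censoring standing in for protocols still in a phase at analysis time:

```python
# From-scratch Kaplan-Meier estimator for phase durations with censoring.
# Sample data is invented for illustration.

def kaplan_meier(durations, observed):
    """Return [(t, S(t))] at each event time. `observed[i]` is True if the
    phase completed (an event) and False if the record was censored."""
    data = sorted(zip(durations, observed))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]     # True counts as 1 event
            removed += 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk   # product-limit update
            curve.append((t, s))
        at_risk -= removed
    return curve

durations = [30, 45, 45, 60, 90]                 # days in a protocol phase
observed  = [True, True, False, True, True]      # one censored record
print(kaplan_meier(durations, observed))
```

The censored record at day 45 leaves the risk set without triggering a survival drop, which is exactly why Kaplan-Meier beats a naive median on in-progress protocols.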
NASA Astrophysics Data System (ADS)
Becker, T.; König, G.
2015-10-01
Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and reinforce Situational Awareness by presenting relevant information to the involved actors. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific analysis throughout the decision-making process. Meaningful cartographic presentation is needed for coordinating the activities of crisis managers in a highly dynamic situation, since operators' attention span and their spatial memories are limiting factors during the perception and interpretation process. Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for making well-thought-out and appropriate decisions. Although utility networks are among the most complex and most frequently required systems in the urban environment, meaningful cartographic presentations of multiple utility networks for disaster management do not exist. Therefore, an optimized visualization of utility infrastructure for emergency response procedures is proposed. The article describes a conceptual approach for simplifying, aggregating, and visualizing multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.
2005-06-01
cognitive task analysis, organizational information dissemination and interaction, systems engineering, collaboration and communications processes, decision-making processes, and data collection and organization. By blending these diverse disciplines, command centers can be designed to support decision-making, cognitive analysis, information technology, and the human-factors engineering aspects of Command and Control (C2). This model can then be used as a baseline when dealing with work in areas of business processes, workflow engineering, information management,
Development of Low-cost, High Energy-per-unit-area Solar Cell Modules
NASA Technical Reports Server (NTRS)
Jones, G. T.; Chitre, S.; Rhee, S. S.
1978-01-01
The development of two hexagonal solar cell process sequences, a laser-scribing technique for scribing hexagonal and modified hexagonal solar cells, a large-throughput diffusion process, and two surface macrostructure processes suitable for large-scale production is reported. Experimental analysis was made of automated spin-on anti-reflective coating equipment and high-pressure wafer cleaning equipment. Six hexagonal solar cell modules were fabricated. Also covered is a detailed theoretical analysis of optimum silicon utilization by modified hexagonal solar cells.
Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process
NASA Technical Reports Server (NTRS)
Guiltinan, J.
1976-01-01
Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.
Research on probabilistic information processing
NASA Technical Reports Server (NTRS)
Edwards, W.
1973-01-01
The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.
NASA Technical Reports Server (NTRS)
Parrish, R. S.; Carter, M. C.
1974-01-01
This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions were computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and the crossing level. Using values for the mean and standard deviation predicted by the method of moments, the distribution parameters were estimated. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.
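The simulation idea can be re-created in a few lines: generate a stationary Gaussian AR(1) process for a chosen autocorrelation parameter and estimate the exceedance probability by Monte Carlo. This is a hedged sketch; the parameter values are illustrative, not those of the report:

```python
# Monte Carlo estimate of level-crossing probability for a stationary
# Gaussian AR(1) process. Parameters below are illustrative assumptions.
import math
import random

def simulate_ar1(n, phi, rng):
    """Stationary AR(1): x_t = phi * x_{t-1} + e_t, with e_t scaled so the
    marginal variance stays at 1 for any |phi| < 1."""
    x = rng.gauss(0, 1)
    sigma_e = math.sqrt(1 - phi * phi)
    out = [x]
    for _ in range(n - 1):
        x = phi * x + rng.gauss(0, sigma_e)
        out.append(x)
    return out

def exceedance_probability(level, n, phi, trials=2000, seed=1):
    """P(max of the process over n steps > level), estimated by simulation."""
    rng = random.Random(seed)
    hits = sum(max(simulate_ar1(n, phi, rng)) > level for _ in range(trials))
    return hits / trials

p = exceedance_probability(level=2.0, n=100, phi=0.8)
print(p)  # chance a 100-step run crosses the 2-sigma level
```

Stronger autocorrelation (larger `phi`) clumps excursions together, which is the functional dependence on the autocorrelation parameter that the abstract reports.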
Top-level modeling of an ALS system utilizing object-oriented techniques
NASA Astrophysics Data System (ADS)
Rodriguez, L. F.; Kang, S.; Ting, K. C.
The possible configuration of an Advanced Life Support (ALS) system capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic while allowing for adjustments and improvements. In addition, by coding the model in Java, it can be implemented via the World Wide Web, greatly encouraging its utilization. Systems analysis is further enabled by a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here are the procedure used to develop the modeling tool, the vision of the modeling tool, and the current focus of each subsystem model.
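The modular subsystem decomposition described above can be sketched with a small object-oriented skeleton. The actual model was written in Java with a database backend; the classes, resource names, and toy flow rules below are simplified assumptions, not the authors' design:

```python
# Minimal OO sketch of an ALS-style subsystem model sharing one resource pool.
# All class names and flow rules are illustrative assumptions.

class Subsystem:
    """Base class: each subsystem consumes and produces named resources."""
    def __init__(self, name):
        self.name = name

    def step(self, resources):
        """Advance one time step, mutating the shared resource pool."""
        raise NotImplementedError

class BiomassProduction(Subsystem):
    def step(self, resources):
        grown = min(resources.get("water", 0), 5)   # toy growth rule
        resources["water"] = resources.get("water", 0) - grown
        resources["food"] = resources.get("food", 0) + grown

class Crew(Subsystem):
    def step(self, resources):
        eaten = min(resources.get("food", 0), 3)    # toy consumption rule
        resources["food"] -= eaten
        resources["waste"] = resources.get("waste", 0) + eaten

class ALSModel:
    """Top-level model: steps each subsystem in order over a shared pool."""
    def __init__(self, subsystems):
        self.subsystems = subsystems

    def run(self, resources, steps):
        for _ in range(steps):
            for s in self.subsystems:
                s.step(resources)
        return resources

model = ALSModel([BiomassProduction("biomass"), Crew("crew")])
state = model.run({"water": 20, "food": 0, "waste": 0}, steps=3)
print(state)
```

The modularity benefit the abstract cites shows up directly: adding, say, a waste-processing subsystem means one new `Subsystem` subclass, with no change to the top-level loop.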
Comparing digital data processing techniques for surface mine and reclamation monitoring
NASA Technical Reports Server (NTRS)
Witt, R. G.; Bly, B. G.; Campbell, W. J.; Bloemer, H. H. L.; Brumfield, J. O.
1982-01-01
The results of three techniques used for processing Landsat digital data are compared for their utility in delineating areas of surface mining and subsequent reclamation. An unsupervised clustering algorithm (ISOCLS), a maximum-likelihood classifier (CLASFY), and a hybrid approach utilizing canonical analysis (ISOCLS/KLTRANS/ISOCLS) were compared by means of a detailed accuracy assessment with aerial photography at NASA's Goddard Space Flight Center. Results show that the hybrid approach was superior to the traditional techniques in distinguishing strip mined and reclaimed areas.
Integration of sustainability into process simulation of a dairy process
USDA-ARS?s Scientific Manuscript database
Life cycle analysis, a method used to quantify the energy and environmental impacts of a process or product, is increasingly utilized by food processors to develop strategies to lessen the carbon footprint of their operations. In the case of the milk supply chain, the method requir...
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC.
These guidelines provide a handbook for use by federal organizations in structuring physical security and risk management programs for their automatic data processing facilities. This publication discusses security analysis, natural disasters, supporting utilities, system reliability, procedural measures and controls, off-site facilities,…
Using Simulation Module, PCLAB, for Steady State Disturbance Sensitivity Analysis in Process Control
ERIC Educational Resources Information Center
Ali, Emad; Idriss, Arimiyawo
2009-01-01
Recently, chemical engineering education has moved toward utilizing simulation software to enhance the learning process, especially in the field of process control. These training simulators provide interactive learning through visualization and practice, which bridges the gap between the theoretical abstraction of textbooks and the…
Using Visualization and Computation in the Analysis of Separation Processes
ERIC Educational Resources Information Center
Joo, Yong Lak; Choudhary, Devashish
2006-01-01
For decades, every chemical engineer has been asked to have a background in separations. The required separations course can, however, be uninspiring and superficial because understanding many separation processes involves conventional graphical methods and commercial process simulators. We utilize simple, user-friendly mathematical software,…
ERIC Educational Resources Information Center
Florin-Thuma, Beth C.; Boudreau, John W.
1987-01-01
Investigated the frequent but previously untested assertion that utility analysis can improve communication and decision making about human resource management programs by examining a performance feedback intervention in a small fast-food store. Results suggest substantial payoffs from performance feedback, though the store's owner-managers had…
ERIC Educational Resources Information Center
Mikus, Robert L.
2014-01-01
Restorative justice philosophy and practices have been utilized in a variety of settings. Legislative reform prompted their application in the criminal and juvenile justice systems. They have also been utilized in employment, education, civic, human services and community settings. While their integration in elementary, intermediate and secondary…
Sankar, M; Chandra, T S
2003-01-01
A detailed analysis was made of chemical fractions of common agro-residues before and after pretreatment (alkali and hydrogen peroxide), and the selective utilization of components such as WSS, EBS, TSS, lignin, cellulose and hemicellulose by pure and mixed cultures of cellulolytic and xylanolytic Clostridia was monitored and correlated with the organisms' enzyme activity. For all cultures pretreatment gave higher utilization of hemicellulose and cellulose fractions; hydrogen peroxide pretreatment was more effective than NaOH treatment. Lignin utilization was not very significant even on pretreatment. C.TM1 and C.SA IV utilized hemicellulose and cellulose better than mixed cultures in selected substrates. These results help to determine the substrate composition, pretreatment conditions and enzyme system of the organism needed when designing an inoculum for agricultural waste treatment processes such as composting or biogas generation.
Trajectory Dispersed Vehicle Process for Space Launch System
NASA Technical Reports Server (NTRS)
Statham, Tamara; Thompson, Seth
2017-01-01
The Space Launch System (SLS) vehicle is part of NASA's deep space exploration plans that include manned missions to Mars. Manufacturing uncertainties in design parameters are key considerations throughout SLS development, as they have significant effects on focus parameters such as lift-off thrust-to-weight ratio, vehicle payload, maximum dynamic pressure, and compression loads. This presentation discusses how the SLS program captures these uncertainties by utilizing a 3-degree-of-freedom (DOF) process called Trajectory Dispersed (TD) analysis. This analysis biases nominal trajectories to identify extremes in the design parameters for various potential SLS configurations and missions. The process utilizes a Design of Experiments (DOE) and response surface methodologies (RSM) to statistically sample uncertainties, and develops the resulting vehicles using a Maximum Likelihood Estimate (MLE) process for targeting uncertainty biases. These vehicles represent various missions and configurations and are used as key inputs into a variety of analyses in the SLS design process, including 6-DOF dispersions, separation clearances, and engine-out failure studies.
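The DOE/response-surface step described above can be sketched with a toy example: sample two hypothetical uncertain parameters, evaluate a made-up quadratic response, and recover the surface coefficients by least squares. All names and numbers below are invented for illustration; this is not the SLS model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "vehicle response": max dynamic pressure as a function of two
# uncertain parameters (hypothetical stand-ins for mass and thrust error).
def response(mass_err, thrust_err):
    return 30.0 + 4.0 * thrust_err - 3.0 * mass_err + 1.5 * thrust_err**2

# Space-filling design over the normalized uncertainty ranges
# (a stand-in for a formal DOE).
X = rng.uniform(-1, 1, size=(200, 2))
y = response(X[:, 0], X[:, 1])

# Quadratic response-surface fit:
# y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
```

Because the toy response is exactly quadratic, the fit recovers its coefficients; on a real dispersion study the fitted surface would then be searched for the extreme-case vehicles.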
Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors
USDA-ARS?s Scientific Manuscript database
Ultrasound-enhanced bioscouring process factors for greige cotton fabric are examined using custom experimental design utilizing statistical principles. An equation is presented which predicts bioscouring performance based upon percent reflectance values obtained from UV-Vis measurements of rutheniu...
Graphics Processing Unit Assisted Thermographic Compositing
NASA Technical Reports Server (NTRS)
Ragasa, Scott; McDougal, Matthew; Russell, Sam
2012-01-01
Objective: To develop a software application utilizing general purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general purpose fashion is allowing for supercomputer level results at individual workstations. As data sets grow, the methods to work them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism, that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques. Technical Methodology/Approach: Apply massively parallel algorithms and data structures to the specific analysis requirements presented when working with thermographic data sets.
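The data-parallel property the abstract describes — the same computation applied independently to each element — can be sketched with whole-array operations (NumPy here as a CPU stand-in for a GPU kernel; the frame stack is synthetic, not real thermographic data):

```python
import numpy as np

# Each pixel's computation is independent of every other pixel -- the
# "data parallel" property that maps well onto GPU hardware.
frames = np.random.default_rng(3).random((16, 64, 64))  # synthetic frame stack

# Per-pixel temporal statistics computed as single array operations, the kind
# of reduction a GPU would run with one thread per pixel.
pixel_mean = frames.mean(axis=0)
pixel_std = frames.std(axis=0)
composite = frames.max(axis=0) - frames.min(axis=0)  # peak-contrast composite
```

On a GPU the same reductions would be expressed as a kernel launched over the 64x64 pixel grid; the arithmetic per pixel is unchanged.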
Electrophoresis gel image processing and analysis using the KODAK 1D software.
Pizzonia, J
2001-06-01
The present article reports on the performance of the KODAK 1D Image Analysis Software for the acquisition of information from electrophoresis experiments and highlights the utility of several mathematical functions for subsequent image processing, analysis, and presentation. Digital images of Coomassie-stained polyacrylamide protein gels containing molecular weight standards and ethidium bromide stained agarose gels containing DNA mass standards are acquired using the KODAK Electrophoresis Documentation and Analysis System 290 (EDAS 290). The KODAK 1D software is used to optimize lane and band identification using features such as isomolecular weight lines. Mathematical functions for mass standard representation are presented, and two methods for estimation of unknown band mass are compared. Given the progressive transition of electrophoresis data acquisition and daily reporting in peer-reviewed journals to digital formats ranging from 8-bit systems such as EDAS 290 to more expensive 16-bit systems, the utility of algorithms such as Gaussian modeling, which can correct geometric aberrations such as clipping due to signal saturation common at lower bit depth levels, is discussed. Finally, image-processing tools that can facilitate image preparation for presentation are demonstrated.
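The Gaussian-modeling correction for clipped (saturated) bands mentioned above can be sketched as follows: because the logarithm of a Gaussian profile is a parabola, the peak parameters can be recovered from the unsaturated samples alone. The profile below is synthetic, and this illustrates the general technique, not the KODAK 1D implementation.

```python
import numpy as np

x = np.linspace(0, 60, 121)
true = 300.0 * np.exp(-(x - 30.0)**2 / (2 * 4.0**2))  # band intensity profile
sat = np.minimum(true, 255.0)          # 8-bit saturation clips the peak

# Fit a Gaussian using only the unsaturated samples: log(y) is quadratic in x.
ok = (sat < 255.0) & (sat > 1.0)
c2, c1, c0 = np.polyfit(x[ok], np.log(sat[ok]), 2)

# Recover the Gaussian parameters from the parabola coefficients.
sigma = np.sqrt(-1.0 / (2.0 * c2))
mu = -c1 / (2.0 * c2)
amp = np.exp(c0 - c1**2 / (4.0 * c2))
```

The fit recovers the true peak amplitude (300) even though the recorded maximum was clipped at 255, which is the aberration correction the article discusses for lower bit-depth systems.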
Flat plate vs. concentrator solar photovoltaic cells - A manufacturing cost analysis
NASA Technical Reports Server (NTRS)
Granon, L. A.; Coleman, M. G.
1980-01-01
The choice of which photovoltaic system (flat plate or concentrator) to use for utilizing solar cells to generate electricity depends mainly on the cost. A detailed, comparative manufacturing cost analysis of the two types of systems is presented. Several common assumptions, i.e., cell thickness, interest rate, power rate, factory production life, polysilicon cost, and direct labor rate are utilized in this analysis. Process sequences, cost variables, and sensitivity analyses have been studied, and results of the latter show that the most important parameters which determine manufacturing costs are concentration ratio, manufacturing volume, and cell efficiency. The total cost per watt of the flat plate solar cell is $1.45, and that of the concentrator solar cell is $1.85, the higher cost being due to the increased process complexity and material costs.
Experimental and Computational Analysis of Modes in a Partially Constrained Plate
2004-03-01
way to quantify a structure. One technique utilizing an energy method is the Statistical Energy Analysis (SEA). The SEA process involves regarding...B.R. Mace. “Statistical Energy Analysis of Two Edge-Coupled Rectangular Plates: Ensemble Averages,” Journal of Sound and Vibration, 193(4): 793-822
Discovery of 100K SNP array and its utilization in sugarcane
USDA-ARS?s Scientific Manuscript database
Next generation sequencing (NGS) enables us to identify thousands of single nucleotide polymorphism (SNP) markers for genotyping and fingerprinting. However, the process requires a very precise bioinformatics analysis and filtering process. High throughput SNP array with predefined genomic location co...
Dynamic analysis of process reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shadle, L.J.; Lawson, L.O.; Noel, S.D.
1995-06-01
The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas{trademark} gasification process is used to illustrate the utility of this approach. PyGas{trademark} is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas{trademark} gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.
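The simplest form of the "gain parameter / transfer function" information that a steady-state sensitivity analysis feeds into a dynamic model is a first-order response, sketched below. This is purely illustrative and has no connection to the actual PyGas model.

```python
import numpy as np

def step_response(gain, tau, t):
    """First-order process response to a unit step input: y = K*(1 - exp(-t/tau))."""
    return gain * (1.0 - np.exp(-t / tau))

t = np.linspace(0, 50, 501)          # time axis, arbitrary units
y = step_response(gain=2.0, tau=10.0, t=t)
# y starts at 0 and approaches the steady-state gain (2.0) as t >> tau.
```

The steady-state gain here plays the role of the process sensitivity computed in the first analysis step; the time constant is what the dynamic model adds.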
Analysis of Low-Temperature Utilization of Geothermal Resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Brian
Full realization of the potential of what might be considered “low-grade” geothermal resources will require that we examine many more uses for the heat than traditional electricity generation. To demonstrate that geothermal energy truly has the potential to be a national energy source we will be designing, assessing, and evaluating innovative uses for geothermal-produced water such as hybrid biomass-geothermal cogeneration of electricity and district heating and efficiency improvements to the use of cellulosic biomass in addition to utilization of geothermal in district heating for community redevelopment projects. The objectives of this project were: 1) to perform a techno-economic analysis of the integration and utilization potential of low-temperature geothermal sources. Innovative uses of low-enthalpy geothermal water were designed and examined for their ability to offset fossil fuels and decrease CO2 emissions. 2) To perform process optimizations and economic analyses of processes that can utilize low-temperature geothermal fluids. These processes included electricity generation using biomass and district heating systems. 3) To scale up and generalize the results of three case study locations to develop a regionalized model of the utilization of low-temperature geothermal resources. A national-level, GIS-based, low-temperature geothermal resource supply model was developed and used to develop a series of national supply curves. We performed an in-depth analysis of the low-temperature geothermal resources that dominate the eastern half of the United States. The final products of this study include 17 publications, an updated version of the cost estimation software GEOPHIRES, and direct-use supply curves for low-temperature utilization of geothermal resources.
The supply curves for direct-use geothermal include utilization from known hydrothermal, undiscovered hydrothermal, and near-hydrothermal EGS resources, and we presented these results at the Stanford Geothermal Workshop. We have also incorporated our wellbore model into TOUGH2-EGS and began coding TOUGH2-EGS with the wellbore model into GEOPHIRES as a reservoir thermal drawdown option. Additionally, case studies for the WVU and Cornell campuses were performed to assess the potential for district heating and cooling at these two eastern U.S. sites.
Schwarz, Patric; Pannes, Klaus Dieter; Nathan, Michel; Reimer, Hans Jorg; Kleespies, Axel; Kuhn, Nicole; Rupp, Anne; Zügel, Nikolaus Peter
2011-10-01
The decision to optimize the processes in the operating suite was based on two factors: competition among clinics and a desire to optimize the use of available resources. The aim of the project was to improve operating room (OR) capacity utilization by reducing change and throughput times per patient. The study was conducted at the Centre Hospitalier Emil Mayrisch, a clinic for specialized care (n = 618 beds) in southern Luxembourg. A prospective analysis was performed before and after the implementation of optimized processes. Value stream analysis and design (value stream mapping, VSM) were used as tools. VSM depicts patient throughput and the corresponding information flows. Furthermore, it is used to identify process waste (e.g. time, human resources, materials, etc.). For this purpose, change times per patient (extubation of patient 1 until intubation of patient 2) and throughput times (inward transfer until outward transfer) were measured. VSM, change and throughput times for 48 patient flows (VSM-A(1), actual state = initial situation) served as the starting point. Interdisciplinary development of an optimized VSM (VSM-O) was evaluated. Prospective analyses of 42 patients (VSM-A(2)) without and 75 patients (VSM-O) with an optimized process in place were conducted. The prospective analysis resulted in a mean change time (mean ± SEM) of 1,507 ± 100 s for VSM-A(2) versus 933 ± 66 s for VSM-O (p < 0.001). The mean throughput time (mean ± SEM) was 151 min (±8) for VSM-A(2) versus 120 min (±10) for VSM-O (p < 0.05). This corresponds to a 23% decrease in waiting time per patient in total. Efficient OR capacity utilization and the optimized use of human resources allowed an additional 1820 interventions to be carried out per year without any increase in human resources. In addition, perioperative patient monitoring was increased up to 100%.
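The before/after comparison reported above (means ± SEM with a significance test) can be sketched with synthetic data. The sample sizes and moments below are taken from the abstract, but the generated times are simulated, and Welch's t statistic stands in for whatever test the authors actually used.

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

rng = np.random.default_rng(2)
# Synthetic changeover times (seconds); SD reconstructed from SEM via SD = SEM*sqrt(n).
before = rng.normal(1507, 100 * np.sqrt(42), 42)   # VSM-A(2): mean 1507 s, SEM ~100 s
after = rng.normal(933, 66 * np.sqrt(75), 75)      # VSM-O:    mean  933 s, SEM  ~66 s
t = welch_t(before, after)
```

With a true difference of roughly 574 s against a combined standard error near 120 s, the statistic lands well past conventional significance thresholds, consistent with the reported p < 0.001.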
Graphics Processing Unit Assisted Thermographic Compositing
NASA Technical Reports Server (NTRS)
Ragasa, Scott; McDougal, Matthew; Russell, Sam
2013-01-01
Objective: To develop a software application utilizing general purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general purpose fashion is allowing for supercomputer level results at individual workstations. As data sets grow, the methods to work them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism, that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadowski, F.G.; Covington, S.J.
1987-01-01
Advanced digital processing techniques were applied to Landsat-5 Thematic Mapper (TM) data and SPOT high-resolution visible (HRV) panchromatic data to maximize the utility of images of a nuclear power plant emergency at Chernobyl in the Soviet Ukraine. The results of the data processing and analysis illustrate the spectral and spatial capabilities of the two sensor systems and provide information about the severity and duration of the events occurring at the power plant site.
Merlyn J. Paulson
1979-01-01
This paper outlines a project level process (V.I.S.) which utilizes very accurate and flexible computer algorithms in combination with contemporary site analysis and design techniques for visual evaluation, design and management. The process provides logical direction and connecting bridges through problem identification, information collection and verification, visual...
Numerical orbit generators of artificial earth satellites
NASA Astrophysics Data System (ADS)
Kugar, H. K.; Dasilva, W. C. C.
1984-04-01
A numerical orbit integrator is presented that contains updates and improvements relative to the previous versions used by the Departamento de Mecanica Espacial e Controle (DMC) of INPE, and that incorporates newer models resulting from experience acquired over time. Flexibility and modularity were taken into account in order to allow future extensions and modifications. Numerical accuracy, processing speed, and memory savings, as well as usability aspects, were also considered. A user's handbook, a complete program listing, and qualitative analyses of accuracy, processing time, and orbit perturbation effects are included as well.
Meta-Analysis in Higher Education: An Illustrative Example Using Hierarchical Linear Modeling
ERIC Educational Resources Information Center
Denson, Nida; Seltzer, Michael H.
2011-01-01
The purpose of this article is to provide higher education researchers with an illustrative example of meta-analysis utilizing hierarchical linear modeling (HLM). This article demonstrates the step-by-step process of meta-analysis using a recently-published study examining the effects of curricular and co-curricular diversity activities on racial…
Economic analysis of open space box model utilization in spacecraft
NASA Astrophysics Data System (ADS)
Mohammad, Atif F.; Straub, Jeremy
2015-05-01
It is a known fact that the amount of stored data about space grows larger on an everyday basis. However, the utilization of Big Data and related tools to perform ETL (Extract, Transform and Load) applications will soon be pervasive in the space sciences. We have entered a crucial time where using Big Data can be the difference (for terrestrial applications) between organizations underperforming and outperforming their peers. The same is true for NASA and other space agencies, as well as for individual missions and the highly competitive process of mission data analysis and publication. In most industries, established competitors and new entrants alike will leverage data-driven approaches to revolutionize and capture the value of Big Data archives. The Open Space Box Model is poised to take the proverbial "giant leap", as it provides autonomic data processing and communications for spacecraft. Economic value generated from such data processing can be found in earthly organizations in every sector, such as healthcare and retail. Retailers, for example, perform research on Big Data by utilizing sensor-driven embedded data in products within their stores and warehouses to determine how these products are actually used in the real world.
[Thermal energy utilization analysis and energy conservation measures of fluidized bed dryer].
Xing, Liming; Zhao, Zhengsheng
2012-07-01
To propose measures for enhancing thermal energy utilization by analyzing the drying process and operation principle of fluidized bed dryers, in order to guide the optimization and upgrade of fluidized bed drying equipment. Through a systematic analysis of the drying process and operation principle of fluidized beds, the energy conservation law was adopted to calculate the thermal energy of dryers. The thermal energy of fluidized bed dryers is mainly used to make up for the thermal consumption of water evaporation (Qw), hot air leaving the equipment outlet (Qe), the thermal consumption for heating and drying wet materials (Qm), and heat dissipation to the surroundings through hot air pipelines and cyclone separators. Effective measures and major approaches to enhance the thermal energy utilization of fluidized bed dryers were to reduce the heat Qe lost in the exhaust gas, recover the heat in the dryer outlet air, insulate the drying towers, hot air pipes, and cyclone separators, dehumidify the clean inlet air, and reasonably control the drying time and air temperature. Technical parameters such as air supply rate, inlet air temperature and humidity, material temperature, and outlet temperature and humidity are set and controlled to effectively save energy during the drying process and reduce the production cost.
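The heat-balance bookkeeping described above can be sketched directly; the component values below are hypothetical and serve only to show the arithmetic.

```python
def dryer_heat_balance(q_water, q_exhaust, q_material, q_losses):
    """Total heat input Q = Qw + Qe + Qm + Qd, and the fraction spent on evaporation."""
    q_total = q_water + q_exhaust + q_material + q_losses
    efficiency = q_water / q_total   # share of heat doing the actual drying work
    return q_total, efficiency

# Hypothetical component values in kJ per batch, for illustration only.
q_total, eta = dryer_heat_balance(q_water=5000, q_exhaust=2500,
                                  q_material=1500, q_losses=1000)
```

The energy-saving measures listed in the abstract all act by shrinking the non-Qw terms (exhaust loss, wall losses), which raises the useful-heat fraction eta.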
ERIC Educational Resources Information Center
Ingram, Julie; Maye, Damian; Kirwan, James; Curry, Nigel; Kubinakova, Katarina
2014-01-01
Purpose: This article utilizes the Communities of Practice (CoP) framework to examine learning processes among a group of permaculture practitioners in England, specifically examining the balance between core practices and boundary processes. Design/methodology/approach: The empirical basis of the article derives from three participatory workshops…
Flexible Software Architecture for Visualization and Seismic Data Analysis
NASA Astrophysics Data System (ADS)
Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.
2007-12-01
Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands and windows to the specific waveforms and hot key combinations so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can implement the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
Cost analysis of advanced turbine blade manufacturing processes
NASA Technical Reports Server (NTRS)
Barth, C. F.; Blake, D. E.; Stelson, T. S.
1977-01-01
A rigorous analysis was conducted to estimate relative manufacturing costs for high technology gas turbine blades prepared by three candidate materials process systems. The manufacturing costs for the same turbine blade configuration of directionally solidified eutectic alloy, an oxide dispersion strengthened superalloy, and a fiber reinforced superalloy were compared on a relative basis to the costs of the same blade currently in production utilizing the directional solidification process. An analytical process cost model was developed to quantitatively perform the cost comparisons. The impact of individual process yield factors on costs was also assessed as well as effects of process parameters, raw materials, labor rates and consumable items.
Higgins, A; Barnett, J; Meads, C; Singh, J; Longworth, L
2014-12-01
To systematically review the existing literature on the value associated with convenience in health care delivery, independent of health outcomes, and to try to estimate the likely magnitude of any value found. A systematic search was conducted for previously published studies that reported preferences for convenience-related aspects of health care delivery in a manner that was consistent with either cost-utility analysis or cost-benefit analysis. Data were analyzed in terms of the methodologies used, the aspects of convenience considered, and the values reported. Literature searches generated 4715 records. Following a review of abstracts or full-text articles, 27 were selected for inclusion. Twenty-six studies reported some evidence of convenience-related process utility, in the form of either a positive utility or a positive willingness to pay. The aspects of convenience valued most often were mode of administration (n = 11) and location of treatment (n = 6). The most common valuation methodology was a discrete-choice experiment containing a cost component (n = 15). A preference for convenience-related process utility exists, independent of health outcomes. Given the diverse methodologies used to calculate it, and the range of aspects being valued, however, it is difficult to assess how large such a preference might be, or how it may be effectively incorporated into an economic evaluation. Increased consistency in reporting these preferences is required to assess these issues more accurately. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
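A toy sketch of how a convenience-related process utility enters a QALY calculation independently of the health-state utility; all utility values below are invented for illustration.

```python
def qalys(health_utility, process_utility, years):
    """Quality-adjusted life years with an additive process-utility increment."""
    return (health_utility + process_utility) * years

# Hypothetical comparison: identical health outcome, more convenient delivery
# (e.g. home administration vs. clinic visits).
with_convenience = qalys(0.80, 0.02, 10)
without_convenience = qalys(0.80, 0.00, 10)
incremental_qalys = with_convenience - without_convenience
```

The additive form is one modeling choice among several; as the review notes, the diversity of elicitation methods makes it hard to say how large the process-utility increment should be or how it is best incorporated into a cost-utility analysis.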
This Applications Analysis Report evaluates the solidification/stabilization treatment process of Silicate Technology Corporation (STC) for the on-site treatment of hazardous waste. The STC immobilization technology utilizes a proprietary product (FMS Silicate) to chemically stab...
Zhang, Jian; Fang, Zhenhong; Deng, Hongbo; Zhang, Xiaoxi; Bao, Jie
2013-04-01
Cassava cellulose accounts for one quarter of cassava residues and its utilization is important for improving the efficiency and profit in commercial scale cassava ethanol industry. In this study, three scenarios of cassava cellulose utilization for ethanol production were experimentally tested under same conditions and equipment. Based on the experimental results, a rigorous flowsheet simulation model was established on Aspen plus platform and the cost of cellulase enzyme and steam energy in the three cases was calculated. The results show that the simultaneous co-saccharification of cassava starch/cellulose and ethanol fermentation process (Co-SSF) provided a cost effective option of cassava cellulose utilization for ethanol production, while the utilization of cassava cellulose from cassava ethanol fermentation residues was not economically sound. Comparing to the current fuel ethanol selling price, the Co-SSF process may provide an important choice for enhancing cassava ethanol production efficiency and profit in commercial scale. Copyright © 2013 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Layton, Mark H.
2012-07-01
The F-Area Tank Farm (FTF) is owned by the U.S. Department of Energy and operated by Savannah River Remediation, LLC (SRR), the Liquid Waste Operations contractor at DOE's Savannah River Site (SRS). The FTF is in the north-central portion of the SRS and occupies approximately 22 acres within F-Area. The FTF is an active radioactive waste storage facility consisting of 22 carbon steel waste tanks and ancillary equipment such as transfer lines, evaporators and pump tanks. An FTF Performance Assessment (PA) was prepared to support the eventual closure of the FTF underground radioactive waste tanks and ancillary equipment. The PA provides the technical basis and results to be used in subsequent documents to demonstrate compliance with the pertinent requirements identified below for final closure of FTF. The F-Tank Farm is subject to a state industrial waste water permit and a Federal Facility Agreement. Closure documentation will include an F-Tank Farm Closure Plan and tank-specific closure modules utilizing information from the performance assessment. For this reason, the State of South Carolina and the Environmental Protection Agency must be involved in the performance assessment review process. The residual material remaining after tank cleaning is also subject to reclassification prior to closure via a waste determination pursuant to Section 3116 of the Ronald W. Reagan National Defense Authorization Act of Fiscal Year 2005. The projected waste tank inventories in the FTF PA provide reasonably bounding FTF inventory projections while taking into account uncertainties in the effectiveness of future tank cleaning technologies. As waste is removed from the FTF waste tanks, the residual contaminants will be sampled and the remaining residual inventory characterized. In this manner, tank-specific data for the tank inventories at closure will be available to supplement the waste tank inventory projections currently used in the FTF PA.
For FTF, the new tank specific data will be evaluated through the Special Analysis process. The FTF Special Analyses process will be utilized to evaluate information regarding the final residual waste that will be grouted in place in the FTF Tanks and assess the potential impact the new inventory information has on the FTF PA assumptions and results. The Special Analysis can then be used to inform decisions regarding FTF tank closure documents. The purpose of this paper is to discuss the Special Analysis process and share insights gained while implementing this process. An example of an area of interest in the revision process is balancing continuous improvement versus configuration control of agreed upon methodologies. Other subjects to be covered include: 1) defining the scope of the revisions included in the Special Analysis, 2) determining which PA results should be addressed in the Special Analysis, and 3) deciding whether the Special Analysis should utilize more qualitative or quantitative assessments. For the SRS FTF, an FTF PA has been prepared to provide the technical basis and results to be used in subsequent documents to demonstrate compliance with the pertinent requirements for final closure of FTF. The FTF Special Analyses process will be utilized to evaluate the impact new information has on the FTF PA assumptions and results. The Special Analysis can then be used to inform decisions regarding FTF tank closure documents. In preparing SAs, it is crucial that the scope of the SA be well defined within the SA, since the specific scope will vary from SA to SA. Since the SAs are essentially addendums to the PA, the SA scope should utilize the PA as the baseline from which the SA scope is defined. The SA needs to focus on evaluating the change associated with the scope, and not let other changes interfere with the ability to perform that evaluation by masking the impact of the change. 
In preparing the SA, it is also important to let the scope determine whether the Special Analysis should utilize more qualitative or quantitative assessments, and which results from the PA should be addressed in the Special Analysis. These decisions can vary from SA to SA and should not be predetermined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riddle, F. J.
2003-06-26
The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.
Space shuttle main engine numerical modeling code modifications and analysis
NASA Technical Reports Server (NTRS)
Ziebarth, John P.
1988-01-01
The user of computational fluid dynamics (CFD) codes must be concerned with the accuracy and efficiency of the codes if they are to be used for timely design and analysis of complicated three-dimensional fluid flow configurations. A brief discussion of how accuracy and efficiency affect the CFD solution process is given. A more detailed discussion of how efficiency can be enhanced by using a few Cray Research, Inc. utilities to address vectorization is presented, and these utilities are applied to a three-dimensional Navier-Stokes CFD code (INS3D).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shoaf, S.; APS Engineering Support Division
A real-time image analysis system was developed for beam imaging diagnostics. An Apple Power Mac G5 with an Active Silicon LFG frame grabber was used to capture video images that were processed and analyzed. Software routines were created to utilize vector-processing hardware to reduce the time to process images as compared to conventional methods. These improvements allow for more advanced image processing diagnostics to be performed in real time.
Analysis of the PPBE Process in the Current Dynamic Political Environment
2008-06-01
B. PESTEL Analysis of the Post-9/11 Environment ... 1. Political ... Socio-Cultural, Technological, Ecological, and Legal (PESTEL) Analysis was utilized to define the 1960s domestic and DoD environment; the DoD ... environment after 9/11 is also analyzed. The PESTEL Analysis identifies the factors that combine to form these respective environments. Scholarly ...
Systematic donor selection review process improves cardiac transplant volumes and outcomes.
Smith, Jason W; O'Brien, Kevin D; Dardas, Todd; Pal, Jay D; Fishbein, Daniel P; Levy, Wayne C; Mahr, Claudius; Masri, Sofia C; Cheng, Richard K; Stempien-Otero, April; Mokadam, Nahush A
2016-01-01
Heart transplant remains the definitive therapy for advanced heart failure patients but is limited by organ availability. We identified a large number of donor hearts from our organ procurement organization (OPO) being exported to other regions. We engaged a multidisciplinary team including transplant surgeons, cardiologists, and our OPO colleagues to identify opportunities to improve our center-specific organ utilization rate. We performed a retrospective analysis of donor offers before and after institution of a novel review process. Each donor offer made to our program was reviewed on a monthly basis from July 2013 to June 2014 and compared with the previous year. This review process resulted in a transplant utilization rate of 28% for period 1 versus 49% for period 2 (P = .007). Limiting the analysis to offers from our local OPO changed our utilization rate from 46% to 75% (P = .02). Transplant volume increased from 22 to 35 between the 2 study periods. Thirty-day and 1-year mortality were unchanged over the 2 periods. A total of 58 hearts were refused by our center and transplanted at other centers. During period 1, the 30-day and 1-year survival rates for recipients of those organs were 98% and 90%, respectively, comparable with our historical survival data. The simple process of systematically reviewing donor turndown events as a group tended to reduce variability and increase confidence in expanded donor criteria, and resulted in improved donor organ utilization and transplant volumes. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
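The utilization-rate comparison reported above can be reproduced with a small worked example. The offer counts below are assumptions chosen only to be consistent with the reported transplant volumes (22 and 35) and rates (28% and 49%); they are not the study's actual data. A two-sided Fisher exact test on the resulting 2x2 table is sketched in plain Python:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]].

    Enumerates every table with the same margins and sums the
    probabilities of tables no more likely than the observed one.
    """
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2

    def prob(x):  # hypergeometric probability that cell (0,0) equals x
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = prob(a)
    lo = max(0, col1 - row2)
    hi = min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# hypothetical offer counts, chosen to match the reported volumes and rates
accepted_1, declined_1 = 22, 57   # period 1: 22/79 ~ 28%
accepted_2, declined_2 = 35, 36   # period 2: 35/71 ~ 49%
rate_1 = accepted_1 / (accepted_1 + declined_1)
rate_2 = accepted_2 / (accepted_2 + declined_2)
p_value = fisher_exact_2x2(accepted_1, declined_1, accepted_2, declined_2)
```

With these assumed denominators the test yields a p-value below 0.05, in line with the significance the authors report; the true counts would be needed to recover their exact P = .007.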
Li, Li; Nguyen, Kim-Huong; Comans, Tracy; Scuffham, Paul
2018-04-01
Several utility-based instruments have been applied in cost-utility analysis to assess health state values for people with dementia. Nevertheless, concerns and uncertainty regarding their performance for people with dementia have been raised. To assess the performance of available utility-based instruments for people with dementia, their psychometric properties were compared, and factors that cause variations in the reported health state values generated from those instruments were explored by conducting meta-regression analyses. A literature search was conducted and psychometric properties were synthesized to demonstrate the overall performance of each instrument. When available, health state values and variables such as the type of instrument and cognitive impairment level were extracted from each article. A meta-regression analysis was undertaken and available covariates were included in the models. A total of 64 studies providing preference-based values were identified and included. The EuroQol five-dimension questionnaire demonstrated the best combination of feasibility, reliability, and validity. Meta-regression analyses suggested that significant differences exist between instruments, type of respondents, and mode of administration, and that the variations in estimated utility values influence incremental quality-adjusted life-year calculation. This review finds that the EuroQol five-dimension questionnaire is the most valid utility-based instrument for people with dementia, but it should be replaced by others under certain circumstances. Although no utility estimates were reported in this article, the meta-regression analyses show that variations in utility estimates produced by different instruments impact cost-utility analysis, potentially altering the decision-making process in some circumstances. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
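A meta-regression of the kind described reduces to inverse-variance weighted least squares: each study's mean utility is regressed on study-level covariates (instrument, respondent type, severity), weighted by the precision of its estimate. A minimal sketch with one covariate; all study values below are invented for illustration, not taken from the review:

```python
def weighted_ols(x, y, w):
    """Weighted least squares fit of y = a + b*x.

    In a simple meta-regression, x is a study-level covariate, y the
    study's mean utility, and w the inverse-variance weight 1/se^2.
    """
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x)))
    a = ybar - b * xbar
    return a, b

# invented studies: x = dementia severity score, y = mean utility, se = std. error
x = [0, 1, 2, 1, 3]
y = [0.82, 0.71, 0.60, 0.74, 0.48]
se = [0.05, 0.04, 0.06, 0.05, 0.08]
w = [1 / s ** 2 for s in se]
intercept, slope = weighted_ols(x, y, w)
```

A negative slope here would indicate lower reported utilities at higher severity; in the actual review, instrument and mode-of-administration dummies enter the model the same way.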
USDA-ARS?s Scientific Manuscript database
Capillary electrophoresis (CE) and reversed-phase high-performance liquid chromatography (RP-HPLC) analyses were utilized to detect differences in the sarcoplasmic protein profiles of beef strip loins subjected to aging and hydrodynamic pressure processing (HDP) treatments. At 48 h postmortem, stri...
The New Southern FIA Data Compilation System
V. Clark Baldwin; Larry Royer
2001-01-01
In general, the major national Forest Inventory and Analysis annual inventory emphasis has been on data-base design and not on data processing and calculation of various new attributes. Two key programming techniques required for efficient data processing are indexing and modularization. The Southern Research Station Compilation System utilizes modular and indexing...
DOT National Transportation Integrated Search
2017-11-01
The traditional process of identifying corridors for road diet improvements involves selecting potential corridors (mostly based on identifying four-lane roads) and conducting a traffic impact analysis of proposed changes on a selected roadway before ...
NASA Astrophysics Data System (ADS)
Gerber, S.; Holsman, J. P.
1981-02-01
A proposed design analysis is presented of a passive solar, energy-efficient system for a typical three-level, three-bedroom, two-story, garage-under townhouse. The design incorporates the best, most performance-proven, and cost-effective products, materials, processes, technologies, and subsystems available today. Seven distinct categories recognized for analysis are identified as: the exterior environment; the interior environment; conservation of energy; natural energy utilization; auxiliary energy utilization; control and distribution systems; and occupant adaptation. Preliminary design features, fenestration systems, the plenum supply system, the thermal storage party fire walls, direct gain storage, the radiant comfort system, and direct passive cooling systems are briefly described.
2002-12-01
Accounting and Reporting System-Field Level; SWOT: Strengths, Weaknesses, Opportunities, Threats; TMA: TRICARE Management Activity; TOA: Total Obligational... progression of the four principles. [Ref 3] The organization uses SWOT analysis to assist in developing the mission and business... strategy. SWOT stands for the strengths and weaknesses of the organization and the opportunities for and threats to the organization.
State Share of Instruction Funding to Ohio Public Community Colleges: A Policy Analysis
ERIC Educational Resources Information Center
Johnson, Betsy
2012-01-01
This study investigated various state policies to determine their impact on the state share of instruction (SSI) funding to community colleges in the state of Ohio. To complete the policy analysis, the researcher utilized three policy analysis tools, defined by Gill and Saunders (2010) as iterative processes, intuition and judgment, and advice and…
Micro hollow cathode discharge jets utilizing solid fuel
NASA Astrophysics Data System (ADS)
Nikic, Dejan
2017-10-01
Micro hollow cathode discharge devices with a solid fuel layer embedded between the electrodes have demonstrated an enhanced jetting process. Outlined is a series of experiments under various pressure and gas conditions, as well as in vacuum. Examples of the use of these devices in series and parallel configurations are presented. Evidence of utilization of the solid fuel is obtained through optical spectroscopy and analysis of the remaining fuel layer.
ERIC Educational Resources Information Center
Sokolowski, Andrzej; Li, Yeping; Willson, Victor
2015-01-01
Background: The process of problem solving is difficult for students; thus, mathematics educators have made multiple attempts to seek ways of making this process more accessible to learners. The purpose of this study was to examine the effect size statistic of utilizing exploratory computerized environments (ECEs) to support the process of word…
NASA Astrophysics Data System (ADS)
Ashraf, M. A. M.; Kumar, N. S.; Yusoh, R.; Hazreek, Z. A. M.; Aziman, M.
2018-04-01
Site classification utilizing the average shear-wave velocity to 30 meters depth (Vs(30)) is a typical approach. Numerous geophysical methods have been proposed for estimation of shear-wave velocity utilizing an assortment of testing configurations, processing methods, and inversion algorithms. The Multichannel Analysis of Surface Waves (MASW) method has been practiced by numerous specialists and professionals in geotechnical engineering for local site characterization and classification. This study aims to determine the site classification on soft and hard ground using the MASW method. The subsurface classification was made utilizing the National Earthquake Hazards Reduction Program (NEHRP) and International Building Code (IBC) classifications. Two sites were chosen for shear-wave velocity acquisition: one in the state of Pulau Pinang for soft soil and one in Perlis for hard rock. Results suggest that the MASW technique can be utilized to spatially map the distribution of shear-wave velocity (Vs(30)) in soil and rock to characterize areas.
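The Vs(30) parameter used here is the time-averaged shear-wave velocity over the top 30 m, Vs30 = 30 / Σ(h_i/v_i), which is then mapped onto a site class. A minimal sketch; the layer profile below is a made-up soft-soil example, while the class boundaries follow the standard NEHRP table:

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m.

    layers: list of (thickness_m, vs_m_per_s) from the surface down;
    only the first 30 m contribute. Equals 30 / sum(h_i / v_i) when the
    profile reaches 30 m; otherwise averages over the available depth.
    """
    depth = 0.0
    travel_time = 0.0
    for h, v in layers:
        use = min(h, 30.0 - depth)
        if use <= 0:
            break
        travel_time += use / v
        depth += use
    return depth / travel_time

def nehrp_class(v):
    """NEHRP site class from Vs30 in m/s (A: hard rock ... E: soft soil)."""
    if v > 1500:
        return "A"
    if v > 760:
        return "B"
    if v > 360:
        return "C"
    if v > 180:
        return "D"
    return "E"

profile = [(5, 150), (10, 250), (15, 400)]  # hypothetical (thickness, Vs) layers
v = vs30(profile)
site_class = nehrp_class(v)
```

For this invented profile the harmonic-style averaging gives roughly 270 m/s, i.e. a stiff-soil class D site, illustrating how a soft near-surface layer dominates Vs(30).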
Dockres: a computer program that analyzes the output of virtual screening of small molecules
2010-01-01
Background This paper describes a computer program named Dockres that is designed to analyze and summarize results of virtual screening of small molecules. The program is supplemented with utilities that support the screening process. Foremost among these utilities are scripts that run the virtual screening of a chemical library on a large number of processors in parallel. Methods Dockres and some of its supporting utilities are written in Fortran-77; other utilities are written as C-shell scripts. They support the parallel execution of the screening. The current implementation of the program handles virtual screening with Autodock-3 and Autodock-4, but can be extended to work with the output of other programs. Results Analysis of virtual screening by Dockres led to both active and selective lead compounds. Conclusions Analysis of virtual screening was facilitated and enhanced by Dockres in the authors' laboratories as well as in laboratories elsewhere. PMID:20205801
Metagenomic analysis of the rhizosphere soil microbiome with respect to phytic acid utilization.
Unno, Yusuke; Shinano, Takuro
2013-01-01
While phytic acid is a major form of organic phosphate in many soils, plant utilization of phytic acid is normally limited; however, culture trials of Lotus japonicus using experimental field soil that had been managed without phosphate fertilizer for over 90 years showed significant usage of phytic acid applied to soil for growth and flowering and differences in the degree of growth, even in the same culture pot. To understand the key metabolic processes involved in soil phytic acid utilization, we analyzed rhizosphere soil microbial communities using molecular ecological approaches. Although molecular fingerprint analysis revealed changes in the rhizosphere soil microbial communities from bulk soil microbial community, no clear relationship between the microbiome composition and flowering status that might be related to phytic acid utilization of L. japonicus could be determined. However, metagenomic analysis revealed changes in the relative abundance of the classes Bacteroidetes, Betaproteobacteria, Chlorobi, Dehalococcoidetes and Methanobacteria, which include strains that potentially promote plant growth and phytic acid utilization, and some gene clusters relating to phytic acid utilization, such as alkaline phosphatase and citrate synthase, with the phytic acid utilization status of the plant. This study highlights phylogenetic and metabolic features of the microbial community of the L. japonicus rhizosphere and provides a basic understanding of how rhizosphere microbial communities affect the phytic acid status in soil.
NASA Astrophysics Data System (ADS)
Obracaj, Piotr; Fabianowski, Dariusz
2017-10-01
Implementations concerning the adaptation of historic facilities into public utility buildings are associated with the necessity of solving many complex, often conflicting expectations of future users. This mainly concerns the function, which includes construction, technology, and aesthetic issues. The list of issues is completed by the proper protection of historic values, different in each case. The procedure leading to the expected solution is a multicriteria one, usually difficult to define accurately and requiring considerable designer experience. An innovative approach has been used for the analysis, namely the modified EA FAHP (Extent Analysis Fuzzy Analytic Hierarchy Process) method of Chang, a multicriteria analysis for the assessment of complex functional and spatial issues. The selection of the optimal spatial form of an adapted historic building intended as a multi-functional public utility facility was analysed. The assumed functional flexibility was determined in the scope of education, conferences, and chamber performances, such as drama and concerts, in different stage-audience layouts.
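Chang's extent-analysis FAHP, which the paper modifies, reduces to a few arithmetic steps on triangular fuzzy numbers (l, m, u): sum each row of the pairwise-comparison matrix, normalize by the grand total to get a fuzzy synthetic extent S_i per criterion, compute degrees of possibility V(S_i >= S_k), and take minima as crisp weights. A compact sketch of the unmodified method; the 3-criterion matrix is invented for illustration and the paper's modifications are not reproduced:

```python
def synthetic_extents(matrix):
    """Fuzzy synthetic extent S_i for each row of a pairwise-comparison
    matrix whose entries are triangular fuzzy numbers (l, m, u)."""
    row_sums = [tuple(sum(t[k] for t in row) for k in range(3)) for row in matrix]
    total = tuple(sum(r[k] for r in row_sums) for k in range(3))
    # multiply each row sum by the inverse of the grand total: (l,m,u)^-1 = (1/u, 1/m, 1/l)
    return [(r[0] / total[2], r[1] / total[1], r[2] / total[0]) for r in row_sums]

def possibility(s2, s1):
    """Degree of possibility V(S2 >= S1) for triangular fuzzy numbers."""
    l1, m1, u1 = s1
    l2, m2, u2 = s2
    if m2 >= m1:
        return 1.0
    if l1 >= u2:
        return 0.0
    return (l1 - u2) / ((m2 - u2) - (m1 - l1))

def fahp_weights(matrix):
    """Crisp priority weights: minimum degree of possibility per row, normalized."""
    s = synthetic_extents(matrix)
    d = [min(possibility(s[i], s[k]) for k in range(len(s)) if k != i)
         for i in range(len(s))]
    total = sum(d)
    return [di / total for di in d]

# invented 3-criterion comparison matrix (reciprocal triangular judgments)
matrix = [
    [(1.0, 1.0, 1.0),   (0.80, 1.00, 1.20), (1.50, 2.00, 2.50)],
    [(0.83, 1.00, 1.25), (1.0, 1.0, 1.0),   (1.20, 1.50, 1.80)],
    [(0.40, 0.50, 0.67), (0.56, 0.67, 0.83), (1.0, 1.0, 1.0)],
]
w = fahp_weights(matrix)
```

For this matrix the third criterion, judged weakest against both others, receives by far the smallest weight, which is the prioritization behavior the method is used for.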
Weernink, Marieke G M; Groothuis-Oudshoorn, Catharina G M; IJzerman, Maarten J; van Til, Janine A
2016-01-01
The objective of this study was to compare treatment profiles including both health outcomes and process characteristics in Parkinson disease using best-worst scaling (BWS), time trade-off (TTO), and visual analogue scales (VAS). From the model comprising seven attributes with three levels, six unique profiles were selected representing process-related factors and health outcomes in Parkinson disease. A Web-based survey (N = 613) was conducted in a general population to estimate process-related utilities using profile-based BWS (case 2), multiprofile-based BWS (case 3), TTO, and VAS. The rank order of the six profiles was compared, convergent validity among methods was assessed, and individual analysis focused on the differentiation between pairs of profiles with the methods used. The aggregated health-state utilities for the six treatment profiles were highly comparable for all methods and no rank reversals were identified. On the individual level, the convergent validity between all methods was strong; however, respondents differentiated less in the utility of closely related treatment profiles with a VAS or TTO than with BWS. For TTO and VAS, this resulted in nonsignificant differences in mean utilities for closely related treatment profiles. This study suggests that all methods are equally able to measure process-related utility when the aim is to estimate the overall value of treatments. On an individual level, such as in shared decision making, BWS allows for better prioritization of treatment alternatives, especially if they are closely related. The decision-making problem and the need for explicit trade-off between attributes should determine the choice of method. Copyright © 2016. Published by Elsevier Inc.
A Qualitative Analysis of College Women's Leaving Processes in Abusive Relationships
ERIC Educational Resources Information Center
Edwards, Katie M.; Murphy, Megan J.; Tansill, Erin C.; Myrick, Christina; Probst, Danielle R.; Corsa, Rebecca; Gidycz, Christine A.
2012-01-01
Objective: This study assessed the process of leaving an abusive dating relationship utilizing a qualitative design. Methods: Participants included 123 college women in abusive dating relationships who participated at the beginning and end of a 10-week academic quarter. Results: Qualitative content analyses were used to analyze the transcribed…
The Cognitive-Miser Response Model: Testing for Intuitive and Deliberate Reasoning
ERIC Educational Resources Information Center
Bockenholt, Ulf
2012-01-01
In a number of psychological studies, answers to reasoning vignettes have been shown to result from both intuitive and deliberate response processes. This paper utilizes a psychometric model to separate these two response tendencies. An experimental application shows that the proposed model facilitates the analysis of dual-process item responses…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brow, R.K.; Kovacic, L.; Chambers, R.S.
1996-04-01
Hermetic glass sealing technologies developed for weapons component applications can be utilized for the design and manufacture of fuel cells. Design and processing of a seal are optimized through an integrated approach based on glass composition research, finite element analysis, and sealing process definition. Glass sealing procedures are selected to accommodate the limits imposed by glass composition and predictive calculations.
NASA Technical Reports Server (NTRS)
Schubert, Matthew R.; Moore, Andrew J.
2015-01-01
Electron cascades from electrical discharge produce secondary emissions from atmospheric plasma in the ultraviolet band. For a single point of discharge, these emissions exhibit a stereotypical discharge morphology, with latent information about the discharge location. Morphological processing can uncover the location and therefore can have diagnostic utility.
Schubert, Matthew; Moore, Andrew J
2016-03-01
Electron cascades from electrical discharge produce secondary emissions from atmospheric plasma in the ultraviolet band. For a single point of discharge, these emissions exhibit a stereotypical discharge morphology, with latent information about the discharge location. Morphological processing can uncover the location and therefore have diagnostic utility.
Analysis of Alternatives (AoA) Process Improvement Study
2016-12-01
stakeholders, and mapped the process activities and durations. We tasked the SAG members with providing the information required on case studies and... are the expected time savings/cost/risk of any changes? (3) Utilization of case studies for both "good" and "challenged" AoAs to identify lessons...
Utilizing the Theoretical Framework of Collective Identity to Understand Processes in Youth Programs
ERIC Educational Resources Information Center
Futch, Valerie A.
2016-01-01
This article explores collective identity as a useful theoretical framework for understanding social and developmental processes that occur in youth programs. Through narrative analysis of past participant interviews (n = 21) from an after-school theater program, known as "The SOURCE", it was found that participants very clearly describe…
Finite element analysis as a design tool for thermoplastic vulcanizate glazing seals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gase, K.M.; Hudacek, L.L.; Pesevski, G.T.
1998-12-31
There are three materials that are commonly used in commercial glazing seals: EPDM, silicone, and thermoplastic vulcanizates (TPVs). TPVs are a high-performance class of thermoplastic elastomers (TPEs), where TPEs have elastomeric properties with thermoplastic processability. TPVs have emerged as materials well suited for use in glazing seals due to ease of processing, economics, and part design flexibility. The part design and development process is critical to ensure that the chosen TPV provides economics, quality, and function in demanding environments. In the design and development process, there is great value in utilizing dual-durometer systems to capitalize on the benefits of soft and rigid materials. Computer-aided design tools, such as Finite Element Analysis (FEA), are effective in minimizing development time and predicting system performance. Examples of TPV glazing seals will illustrate the benefits of utilizing FEA to take full advantage of the material characteristics, which results in functional performance and quality while reducing development iterations. FEA will be performed on two glazing seal profiles to confirm optimum geometry.
Ground and Range Operations for a Heavy-Lift Vehicle: Preliminary Thoughts
NASA Technical Reports Server (NTRS)
Rabelo, Luis; Zhu, Yanshen; Compton, Jeppie; Bardina, Jorge
2011-01-01
This paper discusses the ground and range operations for a Shuttle-derived Heavy-Lift Vehicle being launched from the Kennedy Space Center on the Eastern Range. Comparisons are made between the Shuttle and a heavy-lift configuration (SLS-ETF MPCV, April 2011) by contrasting their subsystems. The analysis also describes a simulation configuration with the potential to be utilized for heavy-lift vehicle processing/range simulation modeling and for the development of decision-making systems utilized by the range. In addition, a simple simulation model is used to provide the required critical-thinking foundations for this preliminary analysis.
A risk analysis approach applied to field surveillance in utility meters in legal metrology
NASA Astrophysics Data System (ADS)
Rodrigues Filho, B. A.; Nonato, N. S.; Carvalho, A. D.
2018-03-01
Field surveillance represents the level of control in metrological supervision responsible for checking the conformity of measuring instruments in service. Utility meters represent the majority of measuring instruments produced by notified bodies under self-verification in Brazil. They play a major role in the economy, since electricity, gas, and water are the main inputs to industries in their production processes. To optimize the resources allocated to controlling these devices, the present study applied a risk analysis in order to identify, among the 11 manufacturers notified for self-verification, the instruments that demand field surveillance.
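A risk analysis of this kind can be as simple as scoring each notified manufacturer on likelihood of non-conformity and on exposure, then ranking to target surveillance resources. The manufacturers, rates, and installed-base figures below are hypothetical; the paper's actual risk criteria are not reproduced here:

```python
# hypothetical data: non-conformity rate observed in past verifications
# and installed base (meters in service) per notified manufacturer
meters = {
    "maker_A": {"nonconformity_rate": 0.040, "installed_base": 120_000},
    "maker_B": {"nonconformity_rate": 0.015, "installed_base": 450_000},
    "maker_C": {"nonconformity_rate": 0.080, "installed_base": 30_000},
}

def risk_score(m):
    # expected number of non-conforming meters in service:
    # likelihood x exposure, a common simple risk metric
    return m["nonconformity_rate"] * m["installed_base"]

# highest-risk manufacturers first: candidates for field surveillance
ranked = sorted(meters, key=lambda k: risk_score(meters[k]), reverse=True)
```

Note that under this metric a large installed base can outweigh a lower defect rate, which is why the worst per-unit manufacturer is not necessarily the surveillance priority.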
NASA Technical Reports Server (NTRS)
Brooner, W. G.; Nichols, D. A.
1972-01-01
Development of a scheme for utilizing remote sensing technology in an operational program for regional land use planning and land resource management applications. The scheme utilizes remote sensing imagery as one of several potential inputs to derive desired and necessary data, and considers several alternative approaches to the expansion and/or reduction and analysis of data, using automated data handling techniques. Within this scheme is a five-stage program development which includes: (1) preliminary coordination, (2) interpretation and encoding, (3) creation of data base files, (4) data analysis and generation of desired products, and (5) applications.
Photographic Technology and the Research Process
ERIC Educational Resources Information Center
Noss, Jerome
1974-01-01
Description of photogrammetric analyses which, combined with the current emergence of biomechanics, are utilized to explain and measure photographs of human movement. Oriented toward the use of photogrammetric analysis in physical education research. (JA)
Heterodyne laser spectroscopy system
Wyeth, Richard W.; Paisner, Jeffrey A.; Story, Thomas
1990-01-01
A heterodyne laser spectroscopy system utilizes laser heterodyne techniques for laser isotope separation spectroscopy, vapor diagnostics, and processing of precise laser frequency offsets from a reference frequency, and provides spectral analysis of a laser beam.
Silicon production process evaluations
NASA Technical Reports Server (NTRS)
1982-01-01
Chemical engineering analyses involving the preliminary process design of a plant (1,000 metric tons/year capacity) to produce silicon via the technology under consideration were accomplished. Major activities in the chemical engineering analyses included base case conditions, reaction chemistry, process flowsheet, material balance, energy balance, property data, equipment design, major equipment list, production labor, and forwarding of results for economic analysis. The process design package provided detailed data for raw materials, utilities, major process equipment, and production labor requirements necessary for polysilicon production in each process.
Parallel Index and Query for Large Scale Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Jerry; Wu, Kesheng; Ruebel, Oliver
2011-07-18
Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to processing of a massive 50TB dataset generated by a large scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
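The FastBit-style indexing behind FastQuery is based on bitmaps: values are binned, each bin gets one bitmap with a bit per row, and a range query reduces to bitwise ORs over bin bitmaps. A toy sketch of the idea; the binning scheme and data are invented, and FastBit itself adds compression and edge-bin handling not shown here:

```python
class BitmapIndex:
    """Minimal equality-encoded binned bitmap index (FastBit-style toy)."""

    def __init__(self, values, bins):
        # bins: list of (lo, hi) half-open ranges; one bitmap (a Python int
        # used as a bit vector, bit i = row i) per bin
        self.bins = bins
        self.bitmaps = [0] * len(bins)
        for i, v in enumerate(values):
            for b, (lo, hi) in enumerate(bins):
                if lo <= v < hi:
                    self.bitmaps[b] |= 1 << i
                    break

    def query(self, lo, hi):
        """Row ids with lo <= value < hi, via OR of fully covered bins.

        A full implementation would also scan the partially covered
        edge bins ("candidate check"); this sketch assumes the query
        aligns with bin boundaries.
        """
        result = 0
        for b, (blo, bhi) in enumerate(self.bins):
            if lo <= blo and bhi <= hi:
                result |= self.bitmaps[b]
        return [i for i in range(result.bit_length()) if result >> i & 1]

energies = [0.5, 5.2, 12.9, 3.1, 25.0]            # one value per row
idx = BitmapIndex(energies, [(0, 10), (10, 20), (20, 30)])
hits = idx.query(10, 30)                           # rows with 10 <= e < 30
```

Because the per-bin bitmaps are independent, the same query parallelizes naturally across row ranges, which is the property FastQuery exploits on many cores.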
Greenwood, Taylor J; Lopez-Costa, Rodrigo I; Rhoades, Patrick D; Ramírez-Giraldo, Juan C; Starr, Matthew; Street, Mandie; Duncan, James; McKinstry, Robert C
2015-01-01
The marked increase in radiation exposure from medical imaging, especially in children, has caused considerable alarm and spurred efforts to preserve the benefits but reduce the risks of imaging. Applying the principles of the Image Gently campaign, data-driven process and quality improvement techniques such as process mapping and flowcharting, cause-and-effect diagrams, Pareto analysis, statistical process control (control charts), failure mode and effects analysis, "lean" or Six Sigma methodology, and closed feedback loops led to a multiyear program that has reduced overall computed tomographic (CT) examination volume by more than fourfold and concurrently decreased radiation exposure per CT study without compromising diagnostic utility. This systematic approach involving education, streamlining access to magnetic resonance imaging and ultrasonography, auditing with comparison with benchmarks, applying modern CT technology, and revising CT protocols has led to a more than twofold reduction in CT radiation exposure between 2005 and 2012 for patients at the authors' institution while maintaining diagnostic utility. (©)RSNA, 2015.
NASA Astrophysics Data System (ADS)
Marston, B. K.; Bishop, M. P.; Shroder, J. F.
2009-12-01
Digital terrain analysis of mountain topography is widely utilized for mapping landforms, assessing the role of surface processes in landscape evolution, and estimating the spatial variation of erosion. Numerous geomorphometry techniques exist to characterize terrain surface parameters, although their utility for characterizing the spatial hierarchical structure of the topography and permitting an assessment of erosion/tectonic impact on the landscape is very limited due to scale and data-integration issues. To address this problem, we apply scale-dependent geomorphometric and object-oriented analyses to characterize the hierarchical spatial structure of mountain topography. Specifically, we utilized a high-resolution digital elevation model to characterize complex topography in the Shimshal Valley in the Western Himalaya of Pakistan. To accomplish this, we generate terrain objects (geomorphological features and landforms) including valley floors and walls, drainage basins, drainage networks, ridge networks, slope facets, and elemental forms based upon curvature. Object-oriented analysis was used to characterize object properties accounting for object size, shape, and morphometry. The spatial overlay and integration of terrain objects at various scales defines the nature of the hierarchical organization. Our results indicate that variations in the spatial complexity of the terrain hierarchical organization are related to the spatio-temporal influence of surface processes and landscape evolution dynamics. Terrain segmentation and the integration of multi-scale terrain information permit further assessment of process domains and erosion, tectonic impact potential, and natural hazard potential. We demonstrate this with landform mapping and geomorphological assessment examples.
Technical Standards for Command and Control Information Systems (CCISs) and Information Technology
1994-02-01
formatting, transmitting, receiving, and processing imagery and imagery-related information. The NITFS is in essence the suite of individual standards...also known as Limited Operational Capability-Europe) and the German Joint Analysis System Military Intelligence (JASMIN). Among the approaches being...essence, the other systems utilize a one-level address space where addressing consists of identifying the fire support unit. However, AFATDS utilizes a two
AIRSAR Automated Web-based Data Processing and Distribution System
NASA Technical Reports Server (NTRS)
Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen
2005-01-01
In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.
Corona, Andrea; Ambye-Jensen, Morten; Vega, Giovanna Croxatto; Hauschild, Michael Zwicky; Birkved, Morten
2018-09-01
The Green biorefinery (GBR) is a biorefinery concept that converts fresh biomass into value-added products. The present study combines a Process Flowsheet Simulation (PFS) and Life Cycle Assessment (LCA) to evaluate the technical and environmental performance of different GBR configurations and the cascading utilization of the GBR output. The GBR configurations considered in this study test alternatives in the three main steps of green biorefining: fractionation, precipitation, and protein separation. The different cascade utilization alternatives analyse different options for press-pulp utilization, and the LCA results show that the environmental profile of the GBR is highly affected by the utilization of the press-pulp and thus by the choice of conventional product replaced by the press-pulp. Furthermore, scenario analysis of different GBR configurations shows that higher benefits can be achieved by increasing product yields rather than by lowering energy consumption. Green biorefining is shown to be an interesting biorefining concept, especially in a Danish context. Biorefining of green biomass is technically feasible and can bring environmental savings when compared to conventional production methods. However, the savings will be determined by the processing involved in each conversion stage and by the cascade utilization of the different platform products. Copyright © 2018 Elsevier B.V. All rights reserved.
Design and testing of high temperature micro-ORC test stand using Siloxane as working fluid
NASA Astrophysics Data System (ADS)
Turunen-Saaresti, Teemu; Uusitalo, Antti; Honkatukia, Juha
2017-03-01
Organic Rankine Cycle (ORC) is a mature technology for many larger-capacity applications, e.g. biomass power plants, waste heat recovery, and geothermal power. Recently, more attention has been paid to ORCs utilizing high temperature heat at relatively low power. One attractive application of such ORCs would be the utilization of waste heat from the exhaust gas of combustion engines in stationary and mobile applications. In this paper, a design procedure for the ORC process is described and discussed. The analysis of the major components of the process, namely the evaporator, recuperator, and turbogenerator, is presented. Preliminary experimental results of an ORC process utilizing high temperature exhaust gas heat and using siloxane MDM as the working fluid are also presented and discussed. The turbine utilized in the turbogenerator is a radial inflow turbine, and the turbogenerator consists of the turbine, the electric motor, and the feed pump. Based on the results, it was identified that the studied system is capable of generating electricity from the waste heat of exhaust gases, and it is shown that fluids with high molecular weight and high critical temperature can be utilized as working fluids in high-temperature small-scale ORC applications. 5.1 kW of electric power was generated by the turbogenerator.
A Study on Low-Cost Case Hardening of Mild and Alloy Steels Utilizing Cassava Leaf Media
NASA Astrophysics Data System (ADS)
Gordon, Renee Erica
Conventional case hardening processes have major drawbacks: they are expensive and hazardous to perform. A novel cyaniding technique has been developed to case harden steel using cassava leaf. Cassava is ideal for use in this process as it contains varying levels of cyanogenic glucosides (15-1000 mg of HCN per kg of cassava). The hardening process involves direct thermal decomposition of the HCN, producing C and N gases that diffuse into the steel to create a hardened surface. Pulverized cassava leaf was used in the pack-cyaniding of AISI 1018 and Nitralloy 135 within three process atmospheres. Barium carbonate (BaCO3) was employed as an energizer in the high-temperature regime, while barium chloride (BaCl2) was utilized at low temperatures. Vickers microhardness testing, microstructural characterization, and diffraction techniques were utilized for analysis. While no improvement was observed at low temperatures, processing in the high-temperature regime showed significant hardening. The addition of BaCO3 to pulverized cassava leaf accelerated the hardening process by substantially increasing the resident surface microhardness while producing a shallow case depth. Diffusion theory was used to identify changes resulting from the variation in parameters. The presence of barium carbonate during processing decreased the diffusivity of the hardening agents. This manifested as a very large initial mass transfer of diffusing species localized in the case region, followed by minimal further increase in case depth even as treatment time intervals were increased. The influence of each parameter was assessed using stepwise regression analysis, and a unified model was constructed.
Optical/thermal analysis methodology for a space-qualifiable RTP furnace
NASA Technical Reports Server (NTRS)
Bugby, D.; Dardarian, S.; Cole, E.
1993-01-01
A methodology to predict the coupled optical/thermal performance of a reflective cavity heating system was developed, and a laboratory test to verify the method was carried out. The procedure was utilized to design a rapid thermal processing (RTP) furnace for the Robot-Operated Material Processing in Space (ROMPS) Program, a planned STS HH-G canister experiment involving robotics and material processing in microgravity. The laboratory test employed a tungsten-halogen reflector/lamp to heat thin, p-type silicon wafers. Measurement instrumentation consisted of 5-mil Pt/Pt-Rh thermocouples and an optical pyrometer. The predicted results, utilizing an optical ray-tracing program and a lumped-capacitance thermal analyzer, showed good agreement with the measured data for temperatures exceeding 1300 C.
Yang, Tao; Sezer, Hayri; Celik, Ismail B.; ...
2015-06-02
In the present paper, a physics-based procedure combining experiments and multi-physics numerical simulations is developed for overall analysis of SOFCs operational diagnostics and performance predictions. In this procedure, essential information for the fuel cell is extracted first by utilizing empirical polarization analysis in conjunction with experiments and refined by multi-physics numerical simulations via simultaneous analysis and calibration of polarization curve and impedance behavior. The performance at different utilization cases and operating currents is also predicted to confirm the accuracy of the proposed model. It is demonstrated that, with the present electrochemical model, three air/fuel flow conditions are needed to produce a set of complete data for better understanding of the processes occurring within SOFCs. After calibration against button cell experiments, the methodology can be used to assess performance of planar cells without further calibration. The proposed methodology would accelerate the calibration process and improve the efficiency of design and diagnostics.
NASA Astrophysics Data System (ADS)
Abbate, Agostino; Nayak, A.; Koay, J.; Roy, R. J.; Das, Pankaj K.
1996-03-01
The wavelet transform (WT) has been used to study the nonstationary information in the electroencephalograph (EEG) as an aid in determining the anesthetic depth. A complex analytic mother wavelet is utilized to obtain the time evolution of the various spectral components of the EEG signal. The technique is utilized for the detection and spectral analysis of transient and background processes in the awake and asleep states. It can be observed that the response of both states before the application of the stimulus is similar in amplitude but not in spectral contents, which suggests a background activity of the brain. The brain reacts to the external stimulus in two different modes depending on the state of consciousness of the subject. In the case of awake state, there is an evident increase in response, while for the sleep state a reduction in this activity is observed. This analysis seems to suggest that the brain has an ongoing background process that monitors external stimulus in both the sleep and awake states.
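The time-frequency decomposition described above can be sketched with a complex Morlet mother wavelet. The following numpy-only illustration is not the authors' implementation; the sampling rate, frequency grid, and wavelet parameter w0 are assumptions chosen for demonstration:

```python
import numpy as np

def morlet_cwt(x, fs, freqs, w0=6.0):
    """Continuous wavelet transform magnitude using a complex Morlet wavelet.

    Returns an array of shape (len(freqs), len(x)) giving the time evolution
    of each spectral component of the signal x (a scalogram)."""
    out = np.empty((len(freqs), len(x)))
    for i, f in enumerate(freqs):
        s = w0 * fs / (2 * np.pi * f)              # scale for center frequency f
        t = np.arange(-int(4 * s), int(4 * s) + 1)
        psi = (np.pi ** -0.25) * np.exp(1j * w0 * t / s) * np.exp(-(t / s) ** 2 / 2)
        psi /= np.sqrt(s)                          # equal L2 norm across scales
        out[i] = np.abs(np.convolve(x, np.conj(psi[::-1]), mode="same"))
    return out

# A pure 10 Hz test tone should produce a ridge at 10 Hz in the scalogram.
fs = 250.0
time = np.arange(0, 2, 1 / fs)
test_signal = np.sin(2 * np.pi * 10 * time)
freqs = np.arange(5, 41, dtype=float)              # EEG-relevant band, 5-40 Hz
scalogram = morlet_cwt(test_signal, fs, freqs)
peak_freq = freqs[np.argmax(scalogram.mean(axis=1))]
```

Because the mother wavelet is analytic (complex), the magnitude gives an envelope rather than an oscillating response, which is what makes transient and background activity separable in time.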
Economics of human performance and systems total ownership cost.
Onkham, Wilawan; Karwowski, Waldemar; Ahram, Tareq Z
2012-01-01
The financial costs of investing in people, which are associated with training, acquisition, recruiting, and resolving human errors, have a significant impact on increased total ownership costs. These costs can also contribute to inflated budgets and delayed schedules. The study of economic assessment of human performance in the system acquisition process enhances the visibility of hidden cost drivers, supporting informed program management decisions. This paper presents a literature review of human total ownership cost (HTOC) and cost impacts on overall system performance. Economic value assessment models such as cost-benefit analysis, risk-cost tradeoff analysis, expected value of utility function analysis (EV), the growth readiness matrix, the multi-attribute utility technique, and multiple-regression models are introduced to reflect the HTOC and human performance-technology tradeoffs in terms of dollar value. A human total ownership regression model is introduced to address the measurement of influencing human performance cost components. Results from this study will increase understanding of relevant cost drivers in the system acquisition process over the long term.
AOIPS 3 User's guide. Volume 1: Overview and software utilization
NASA Technical Reports Server (NTRS)
Schotz, S. S.; Negri, A. J.; Robinson, W.
1989-01-01
This is Volume I of the Atmospheric and Oceanographic Information Processing System (AOIPS) User's Guide. AOIPS 3 is the version of the AOIPS software as of April 1989. The AOIPS software was developed jointly by the Goddard Space Flight Center and General Sciences Corporation. Volume 1 is intended to provide the user with an overall guide to the AOIPS system. It introduces the user to AOIPS system concepts, explains how programs are related and the necessary order of program execution, and provides brief descriptions derived from on-line help for every AOIPS program. It is intended to serve as a reference for information such as program function, input/output variable descriptions, and program limitations. AOIPS is an interactive meteorological processing system with capabilities to ingest and analyze many types of meteorological data. AOIPS includes several applications in areas of relevance to meteorological research. AOIPS is partitioned into four applications components: satellite data analysis, radar data analysis, aircraft data analysis, and utilities.
NASA Technical Reports Server (NTRS)
Castruccio, P. A.; Loats, H. L., Jr.
1975-01-01
An analysis of current computer usage by major water resources users was made to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era. The analysis shows significant impact due to the utilization and processing of ERTS CCT data.
ERIC Educational Resources Information Center
Mun, Eun Young; von Eye, Alexander; Bates, Marsha E.; Vaschillo, Evgeny G.
2008-01-01
Model-based cluster analysis is a new clustering procedure to investigate population heterogeneity utilizing finite mixture multivariate normal densities. It is an inferentially based, statistically principled procedure that allows comparison of nonnested models using the Bayesian information criterion to compare multiple models and identify the…
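The procedure described above can be sketched with scikit-learn's Gaussian mixture implementation: fit finite mixtures of multivariate normal densities for several numbers of components and compare them by BIC. This is an illustrative sketch on synthetic two-cluster data, not the authors' analysis:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic heterogeneous population: two well-separated bivariate normal clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (150, 2)),
               rng.normal(8.0, 1.0, (150, 2))])

# Fit mixtures with 1-3 components; BIC trades fit against parameter count
# (lower is better), allowing comparison of nonnested models.
bic = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
       for k in (1, 2, 3)}
best_k = min(bic, key=bic.get)
```

Because BIC penalizes each extra component, the well-separated two-cluster structure should be preferred over both the single-component and overfit models.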
Heuristic Task Analysis on E-Learning Course Development: A Formative Research Study
ERIC Educational Resources Information Center
Lee, Ji-Yeon; Reigeluth, Charles M.
2009-01-01
Utilizing heuristic task analysis (HTA), a method developed for eliciting, analyzing, and representing expertise in complex cognitive tasks, a formative research study was conducted on the task of e-learning course development to further improve the HTA process. Three instructional designers from three different post-secondary institutions in the…
ERIC Educational Resources Information Center
Py, Bernard
1984-01-01
It is suggested that it is not between two languages that transfers and interference occur, but within the learner. The learner mediates and constructs this relationship according to acquisition operations, processes, strategies, and stages that contrastive analysis, despite its utility, can neither account for nor predict. (MSE)
External insulation of electrified railway and energy saving analysis
NASA Astrophysics Data System (ADS)
Dun, Xiaohong
2018-04-01
Through analysis of the formation process of insulator surface fouling and its causes, the use of coating materials on the insulator surfaces of electrified railways to achieve flashover prevention was explored. At the same time, the goal of energy conservation can be achieved.
A risk-based approach to management of leachables utilizing statistical analysis of extractables.
Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M
2015-04-01
To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle.
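The designed-experiment analysis described above can be sketched with a plain one-way ANOVA F statistic. The data below are hypothetical (a single molding parameter, e.g. barrel temperature, versus an extractable peak level), not values from the study:

```python
import numpy as np

def one_way_anova_f(groups):
    """One-way ANOVA F statistic: ratio of between-group to within-group
    mean squares. A large F suggests the factor shifts the response."""
    pooled = np.concatenate(groups)
    grand_mean = pooled.mean()
    k, n = len(groups), len(pooled)
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical extractable peak areas at three temperature settings.
low = np.array([1.00, 1.10, 0.95, 1.05])
mid = np.array([1.90, 2.05, 1.95, 2.10])
high = np.array([3.00, 2.90, 3.10, 3.05])
f_stat = one_way_anova_f([low, mid, high])
```

In a full designed experiment the same idea extends to multiple factors and interactions, which is how statistically significant combinations of molding parameters would be identified.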
Comparative Analysis of Processes for Recovery of Rare Earths from Bauxite Residue
NASA Astrophysics Data System (ADS)
Borra, Chenna Rao; Blanpain, Bart; Pontikes, Yiannis; Binnemans, Koen; Van Gerven, Tom
2016-11-01
Environmental concerns and lack of space suggest that the management of bauxite residue needs to be re-addressed. The utilization of the residue has thus become a topic high on the agenda for both academia and industry, yet, to date, it is only rarely used. Nonetheless, recovery of rare earth elements (REEs) with or without other metals from bauxite residue, and utilization of the left-over residue in other applications like building materials, may be a viable alternative to storage. Hence, different processes developed by the authors for recovery of REEs and other metals from bauxite residue were compared. In this study, preliminary energy and cost analyses were carried out to assess the feasibility of the processes. These analyses show that the combination of alkali roasting-smelting-quenching-leaching is a promising process for the treatment of bauxite residue and that it is justified to study this process at a pilot scale.
Hybrid Cascading Outage Analysis of Extreme Events with Optimized Corrective Actions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vallem, Mallikarjuna R.; Vyakaranam, Bharat GNVSR; Holzer, Jesse T.
2017-10-19
Power systems are vulnerable to extreme contingencies (like an outage of a major generating substation) that can cause significant generation and load loss and can lead to further cascading outages of other transmission facilities and generators in the system. Some cascading outages are seen within minutes following a major contingency, which may not be captured exclusively using dynamic simulation of the power system. Utilities plan for contingencies based on either dynamic or steady-state analysis separately, which may not accurately capture the impact of one process on the other. We address this gap in cascading outage analysis by developing the Dynamic Contingency Analysis Tool (DCAT), which can analyze hybrid dynamic and steady-state behavior of the power system, including protection system models in dynamic simulations, and simulating corrective actions in post-transient steady-state conditions. One of the important implemented steady-state processes is to mimic operator corrective actions to mitigate aggravated states caused by dynamic cascading. This paper presents an Optimal Power Flow (OPF) based formulation for selecting corrective actions that utility operators can take during a major contingency, and thus automates the hybrid dynamic-steady state cascading outage process. The improved DCAT framework with OPF-based corrective actions is demonstrated on the IEEE 300 bus test system.
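The OPF-based corrective-action step can be sketched as a small linear program. This is a toy single-bus illustration under assumed costs and limits, not the DCAT formulation: redispatch the remaining generators and shed load only as a last resort, at a heavy penalty:

```python
from scipy.optimize import linprog

# Decision variables: [g1, g2, shed] in MW. After a major outage the remaining
# capacity (60 + 30 MW) cannot cover the 100 MW demand, forcing some load shed.
cost = [1.0, 2.0, 1000.0]               # $/MWh; load shedding heavily penalized
A_eq = [[1.0, 1.0, 1.0]]                # power balance: g1 + g2 + shed = demand
b_eq = [100.0]
bounds = [(0, 60), (0, 30), (0, 100)]   # post-contingency generator limits

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
g1, g2, shed = res.x                    # optimal: both units at limit, 10 MW shed
```

A full network formulation would add line-flow constraints (e.g. via PTDFs), but the structure is the same: the penalty ordering makes the optimizer exhaust cheap redispatch before shedding load, mimicking operator priorities.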
Processing methods for differential analysis of LC/MS profile data
Katajamaa, Mikko; Orešič, Matej
2005-01-01
Background Liquid chromatography coupled to mass spectrometry (LC/MS) has been widely used in proteomics and metabolomics research. In this context, the technology has been increasingly used for differential profiling, i.e. broad screening of biomolecular components across multiple samples in order to elucidate the observed phenotypes and discover biomarkers. One of the major challenges in this domain remains development of better solutions for processing of LC/MS data. Results We present a software package MZmine that enables differential LC/MS analysis of metabolomics data. This software is a toolbox containing methods for all data processing stages preceding differential analysis: spectral filtering, peak detection, alignment and normalization. Specifically, we developed and implemented a new recursive peak search algorithm and a secondary peak picking method for improving already aligned results, as well as a normalization tool that uses multiple internal standards. Visualization tools enable comparative viewing of data across multiple samples. Peak lists can be exported into other data analysis programs. The toolbox has already been utilized in a wide range of applications. We demonstrate its utility on an example of metabolic profiling of Catharanthus roseus cell cultures. Conclusion The software is freely available under the GNU General Public License and it can be obtained from the project web page at http://mzmine.sourceforge.net/. PMID:16026613
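The peak-detection stage can be illustrated in miniature. This is not MZmine's recursive peak search algorithm; it is a generic sketch using scipy's find_peaks on a synthetic chromatogram with assumed retention times and amplitudes:

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic extracted-ion chromatogram: three Gaussian peaks on a flat baseline.
t = np.linspace(0.0, 10.0, 2000)                  # retention time, minutes

def peak(mu, sigma, amp):
    return amp * np.exp(-(t - mu) ** 2 / (2 * sigma ** 2))

intensity = (peak(2.0, 0.10, 5.0) + peak(5.0, 0.15, 3.0)
             + peak(7.5, 0.10, 8.0) + 0.05)

# Prominence filtering separates chromatographic peaks from baseline ripple.
idx, props = find_peaks(intensity, prominence=0.5)
retention_times = t[idx]
```

Alignment across samples then reduces to matching such peak lists by retention time and m/z within tolerances, which is the step the recursive search and secondary peak picking refine.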
Goodyear, Kimberly; Parasuraman, Raja; Chernyak, Sergey; de Visser, Ewart; Madhavan, Poornima; Deshpande, Gopikrishna; Krueger, Frank
2017-10-01
As society becomes more reliant on machines and automation, understanding how people utilize advice is a necessary endeavor. Our objective was to reveal the underlying neural associations during advice utilization from expert human and machine agents with fMRI and multivariate Granger causality analysis. During an X-ray luggage-screening task, participants accepted or rejected good or bad advice from either the human or machine agent framed as experts with manipulated reliability (high miss rate). We showed that the machine-agent group decreased their advice utilization compared to the human-agent group and these differences in behaviors during advice utilization could be accounted for by high expectations of reliable advice and changes in attention allocation due to miss errors. Brain areas involved with the salience and mentalizing networks, as well as sensory processing involved with attention, were recruited during the task and the advice utilization network consisted of attentional modulation of sensory information with the lingual gyrus as the driver during the decision phase and the fusiform gyrus as the driver during the feedback phase. Our findings expand on the existing literature by showing that misses degrade advice utilization, which is represented in a neural network involving salience detection and self-processing with perceptual integration.
Heterodyne laser spectroscopy system
Wyeth, Richard W.; Paisner, Jeffrey A.; Story, Thomas
1989-01-01
A heterodyne laser spectroscopy system utilizes laser heterodyne techniques for purposes of laser isotope separation spectroscopy, vapor diagnostics, processing of precise laser frequency offsets from a reference frequency and the like, and provides spectral analysis of a laser beam.
Causal modeling in international migration research: a methodological prolegomenon.
Papademetriou, D G; Hopple, G W
1982-10-01
The authors examine the value of using models to study the migration process. In particular, they demonstrate the potential utility of a partial least squares modeling approach to the causal analysis of international migration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lines, Amanda M.; Nelson, Gilbert L.; Casella, Amanda J.
Microfluidic devices are a growing field with significant potential for application to small scale processing of solutions. Much like large scale processing, fast, reliable, and cost effective means of monitoring the streams during processing are needed. Here we apply a novel Micro-Raman probe to the on-line monitoring of streams within a microfluidic device. For either macro or micro scale process monitoring via spectroscopic response, there is the danger of interfering or confounded bands obfuscating results. By utilizing chemometric analysis, a form of multivariate analysis, species can be accurately quantified in solution despite the presence of overlapping or confounded spectroscopic bands. This is demonstrated on solutions of HNO 3 and NaNO 3 within micro-flow and microfluidic devices.
Electroless nickel – phosphorus coating on crab shell particles and its characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arulvel, S., E-mail: gs.arulvel.research@gmail.com; Elayaperumal, A.; Jagatheeshwaran, M.S.
Being a hydrophilic material, crab shell particles have only a limited number of applications. It is, therefore, necessary to modify the surface of the crab shell particles. To make them more widely useful, the approach proposed in this article is to utilize crab shell particles (CSP) as a core coated with nickel phosphorus (NiP) as a shell using the electroless coating process. For dealing with serious environmental problems, utilization of waste bio-shells is always an important factor to be considered. The chelating ability of crab shell particles eliminates the need for surface activation in this work prior to the coating process. The functional groups, phase structure, microstructure, chemical composition, and thermal behavior of CSP and NiP/CSP were characterized using Fourier transform infra-red spectroscopy (FTIR), x-ray diffraction (XRD), scanning electron microscopy (SEM), energy-dispersive x-ray spectroscopy (EDS), and thermogravimetric analysis (TGA). A combination of amorphous and crystalline structure was exhibited by CSP and NiP/CSP. NiP/CSP showed better thermal stability compared to uncoated CSP. Stability, adsorption, and conductivity tests were conducted to study the adsorption behavior and conductivity of the particles. CSP presented a hydrophilic property in contrast to hydrophobic NiP/CSP. NiP/CSP presented a conductivity about 44% greater than that of CSP, without any fluctuations. - Highlights: • Utilization of crab shell waste is focused on. • NiP coating on crab shell particles is fabricated using the electroless process. • Thermal analysis, stability test, adsorption test and conductivity test were done. • Organic matrix of crab shell particle favors the coating process. • Results demonstrate the characterization of CSP core - NiP shell structure.
The Utility of Free Software for Gravity and Magnetic Advanced Data Processing
NASA Astrophysics Data System (ADS)
Grandis, Hendra; Dahrin, Darharta
2017-04-01
The lack of computational tools, i.e. software, often hinders the proper teaching and application of geophysical data processing in academic institutions in Indonesia. Although there are academic licensing options for commercial software, such options are still far beyond the financial capability of some academic institutions. Academic community members (both lecturers and students) are expected to be creative and resourceful to overcome such situations. Therefore, the capability to write computer programs or codes is a necessity. However, there are also many computer programs and even software packages that are freely available on the internet. Generally, the utility of the freely distributed software is limited to demonstration or to visualizing and exchanging data. The paper discusses the utility of Geosoft's Oasis Montaj Viewer along with USGS GX programs that are available for free. Useful gravity and magnetic advanced data processing (i.e. gradient calculation, spectral analysis etc.) can be performed "correctly" without any approximation that sometimes leads to dubious results and interpretation.
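Spectral analysis of a potential-field grid typically starts from the radially averaged power spectrum, which the GX programs compute. The numpy-only sketch below is an illustrative stand-in, not the USGS implementation; the synthetic cosine grid and its wavenumber are assumptions for demonstration:

```python
import numpy as np

def radial_power_spectrum(grid, dx=1.0):
    """Radially averaged power spectrum of a 2-D grid (e.g. a gravity anomaly).

    Returns (wavenumber, power); in gravity/magnetic interpretation the slope
    of log power versus wavenumber relates to ensemble source depth."""
    ny, nx = grid.shape
    F = np.fft.fftshift(np.fft.fft2(grid - grid.mean()))
    power = np.abs(F) ** 2
    ky = np.fft.fftshift(np.fft.fftfreq(ny, d=dx))
    kx = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))
    kr = np.hypot(*np.meshgrid(kx, ky))
    nbins = nx // 2
    edges = np.linspace(0.0, 0.5 / dx, nbins + 1)    # out to the Nyquist ring
    idx = np.digitize(kr.ravel(), edges) - 1
    flat = power.ravel()
    spec = np.array([flat[idx == i].mean() for i in range(nbins)])
    k_mid = 0.5 * (edges[:-1] + edges[1:])
    return k_mid, spec

# A single cosine ridge at wavenumber 0.125 cycles/unit should dominate.
x = np.arange(64)
xx, yy = np.meshgrid(x, x)
grid = np.cos(2 * np.pi * 0.125 * xx)
k, spec = radial_power_spectrum(grid)
k_peak = k[np.argmax(spec)]
```

Binning by radial wavenumber collapses the 2-D spectrum to a 1-D curve, which is the form used for depth estimation and for designing gradient and continuation filters.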
Preliminary results from the High Speed Airframe Integration Research project
NASA Technical Reports Server (NTRS)
Coen, Peter G.; Sobieszczanski-Sobieski, Jaroslaw; Dollyhigh, Samuel M.
1992-01-01
A review is presented of progress toward the near-term objectives of developing an analysis system and optimization methods during the first year of the NASA Langley High Speed Airframe Integration Research (HiSAIR) project. The characteristics of a Mach 3 HSCT transport have been analyzed utilizing the newly developed process. In addition to showing more detailed information about the aerodynamic and structural coupling for this type of vehicle, this exercise aided in further refining the data requirements for the analysis process.
The potential for industrial cogeneration development by 1990
NASA Astrophysics Data System (ADS)
1981-07-01
The cogeneration study focused on five industries that constitute three quarters of industrial steam demand: pulp and paper, chemicals, petroleum refining, steel, and food processing. These industries use almost one fifth of the total energy consumed in the United States. The analysis reflected the investment criteria used by industrial and utility managers and the regulatory concerns in the United States. Phone discussions were held with approximately 70 companies to verify and augment the process and energy use data for the five industries.
Applications of satellite image processing to the analysis of Amazonian cultural ecology
NASA Technical Reports Server (NTRS)
Behrens, Clifford A.
1991-01-01
This paper examines the application of satellite image processing towards identifying and comparing resource exploitation among indigenous Amazonian peoples. The use of statistical and heuristic procedures for developing land cover/land use classifications from Thematic Mapper satellite imagery will be discussed along with actual results from studies of relatively small (100 - 200 people) settlements. Preliminary research indicates that analysis of satellite imagery holds great potential for measuring agricultural intensification, comparing rates of tropical deforestation, and detecting changes in resource utilization patterns over time.
Belmartino, Susana
2014-04-01
This article presents a comparative analysis of the processes leading to health care reform in Argentina and in the USA. The core of the analysis centers on the ideological references utilized by advocates of the reform and the decision-making processes that support or undercut such proposals. The analysis begins with a historical summary of the issue in each country. The political process that led to the sanction of the Obama reform is then described. The text defends a hypothesis aiming to show that deficiencies in the institutional capacities of Argentina's decision-making bodies are a severe obstacle to attaining substantial changes in this area within the country.
No Cost – Low Cost Compressed Air System Optimization in Industry
NASA Astrophysics Data System (ADS)
Dharma, A.; Budiarsa, N.; Watiniasih, N.; Antara, N. G.
2018-04-01
Energy conservation is a systematic, integrated effort to preserve energy sources and improve energy utilization efficiency, utilizing energy efficiently without reducing the energy use that is necessary. Energy conservation efforts are applied at all stages of utilization, from energy resources to final use, using efficient technology and cultivating an energy-efficient lifestyle. The most common way is to promote energy efficiency in industry at the point of end use and to overcome barriers to achieving such efficiency by using system energy optimization programs. The facts show that energy saving efforts in the process usually focus only on replacing tools rather than on overall system improvement. In this research, a framework for sustainable energy reduction in companies that have or have not implemented an energy management system (EnMS) is developed: a systematic technical approach to accurately evaluating a compressed-air system and its optimization potential through observation, measurement, and verification of environmental conditions and processes. The physical quantities of the system, such as air flow, pressure, and electrical power, are measured at given times and processed using comparative analysis methods. A system-level approach offers greater energy savings potential than a component approach, at no cost to the lowest cost (no cost - low cost). The process of evaluating energy utilization and energy saving opportunities will provide recommendations for increasing efficiency in the industry, reducing CO2 emissions, and improving environmental quality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frew, Bethany; Mai, Trieu; Krishnan, Venkat
2016-12-01
In this study, we use the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) capacity expansion model to estimate utility-scale photovoltaic (UPV) deployment trends from the present day through 2030. The analysis seeks to inform the U.S. Bureau of Land Management's (BLM's) planning activities related to UPV development on federal lands in Nevada as part of the Resource Management Plan (RMP) revision for the Las Vegas and Pahrump field offices. These planning activities include assessing the demand for new or expanded Solar Energy Zones (SEZs), per the process outlined in BLM's Western Solar Plan.
Quantification of Operational Risk Using A Data Mining
NASA Technical Reports Server (NTRS)
Perera, J. Sebastian
1999-01-01
What is Data Mining?
- Data Mining is the process of finding actionable information hidden in raw data.
- Data Mining helps find hidden patterns, trends, and important relationships often buried in a sea of data.
- Typically, automated software tools based on advanced statistical analysis and data-modeling technology can be utilized to automate the data mining process.
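As a minimal illustration of the automated pattern-finding described above, the sketch below scans a small set of hypothetical operational records for strongly correlated field pairs. The data, field names, and the 0.8 threshold are all illustrative assumptions, not part of the original report.

```python
from itertools import combinations
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation; returns 0.0 for zero-variance inputs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:
        return 0.0
    return cov / (sx * sy)

def mine_relationships(table, threshold=0.8):
    """Report field pairs whose correlation magnitude exceeds the threshold."""
    found = []
    for a, b in combinations(table, 2):
        r = pearson(table[a], table[b])
        if abs(r) >= threshold:
            found.append((a, b, round(r, 3)))
    return found

# Hypothetical operational records (field names are illustrative).
records = {
    "downtime_hours": [1, 2, 4, 8, 3],
    "incident_count": [2, 3, 9, 15, 5],
    "shift_length":   [8, 8, 8, 8, 8],  # constant field carries no signal
}
relationships = mine_relationships(records)
```

Real data-mining tools automate far more (association rules, decision trees, clustering), but the core idea is the same: surface relationships a person would not spot in the raw records.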
An Onto-Semiotic Analysis of Combinatorial Problems and the Solving Processes by University Students
ERIC Educational Resources Information Center
Godino, Juan D.; Batanero, Carmen; Roa, Rafael
2005-01-01
In this paper we describe an ontological and semiotic model for mathematical knowledge, using elementary combinatorics as an example. We then apply this model to analyze the solving process of some combinatorial problems by students with high mathematical training, and show its utility in providing a semiotic explanation for the difficulty of…
Abstract:This case study application provides discussion on a selected application of advanced concepts, included in the End of Asset Life Reinvestment decision-making process tool, using a utility practitioner’s data set. The tool provides step-by-step process guidance to the as...
Solar Energy Systems for Lunar Oxygen Generation
NASA Technical Reports Server (NTRS)
Colozza, Anthony J.; Heller, Richard S.; Wong, Wayne A.; Hepp, Aloysius F.
2010-01-01
An evaluation of several solar concentrator-based systems for producing oxygen from lunar regolith was performed. The systems utilize a solar concentrator mirror to provide thermal energy for the oxygen production process. A Stirling heat engine and photovoltaics are compared for the production of electricity, which is used to operate the equipment needed in the oxygen production process. The initial oxygen production method used in the analysis is hydrogen reduction of ilmenite. Using this method, a baseline system design was produced with an oxygen production rate of 0.6 kg/hr and a concentrator mirror diameter of 5 m. Variations on the baseline design show how changes in system size and process rate affect the oxygen production rate. An evaluation of the power requirements for a carbothermal lunar regolith reduction reactor was also conducted: the reactor had a total power requirement between 8,320 and 9,961 W when producing 1,000 kg/year of oxygen, and the solar concentrator providing the thermal power (over 82 percent of the total energy requirement) would have a diameter of less than 4 m.
NASA Astrophysics Data System (ADS)
McNeese, L. E.
1981-12-01
The progress made during the period from July 1 through September 30 for the Oak Ridge National Laboratory research and development projects in support of the increased utilization of coal and other fossil fuels as sources of clean energy is reported. The following topics are discussed: coal conversion development, chemical research and development, materials technology, fossil energy materials program, liquefaction projects, component development, process analysis, environmental control technology, atmospheric fluidized bed combustion, underground coal gasification, coal preparation and waste utilization.
Standardization of pitch-range settings in voice acoustic analysis.
Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C
2009-05-01
Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
Deformation processes in forging ceramics
NASA Technical Reports Server (NTRS)
Cannon, R. M.; Rhodes, W. H.
1972-01-01
The deformation processes involved in the forging of refractory ceramic oxides were investigated. A combination of mechanical testing and forging was utilized to investigate both the flow and fracture processes involved. An additional hemisphere forging was performed, which failed prematurely; analysis and comparison with available fracture data for Al2O3 indicated possible causes of the failure. Examination of previous forgings indicated an increase in grain-boundary cavitation with increasing strain.
von Ferber, L; Luciano, A; Köster, I; Krappweis, J
1992-11-01
Drugs in primary health care are often prescribed for nonrational reasons. Drug utilization research investigates the prescription of drugs with an eye to the medical, social, and economic causes and consequences of the prescribed drugs' utilization. The results of this research show distinct differences in drug utilization across age groups and between men and women; indication and dosage appear irrational from a textbook point of view, which points to nonpharmacological causes of drug utilization. To advocate changes for the better, quality assessment groups of primary health care physicians receive information about their established behavior through analysis of their prescriptions. Discussion and comparison within the group allow them to recognize their irrational prescribing and the social, psychological, and economic reasons behind it. Treatment guidelines are then worked out that take the primary health care physician's situation into account. After a year with six meetings of the quality assessment groups, the educational process is evaluated by another drug utilization analysis based on the physicians' prescriptions. The evaluation shows a remarkable improvement in the quality and cost-effectiveness of drug therapy among the participating physicians.
Toxic release consequence analysis tool (TORCAT) for inherently safer design plant.
Shariff, Azmi Mohd; Zaini, Dzulkarnain
2010-10-15
Many major accidents due to toxic release in the past have caused numerous fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is the inherently safer design technique, which utilizes inherent safety principles to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented in the preliminary design stage, where the consequence of a toxic release can be evaluated and necessary design improvements can be implemented to eliminate or minimize accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, no commercial tool with such capability is currently available. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via inherent safety principles, built by integrating a process design simulator with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting the inherent safety principle early in the preliminary design stage.
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1979-01-01
The manufacturing methods for photovoltaic solar energy utilization are assessed. Economic and technical data on the current front-junction formation processes of gaseous diffusion and ion implantation are presented. Future proposals to decrease the cost of junction formation, including modifications to gaseous diffusion and the use of ion implantation, are studied. Technology developments in current processes and an economic evaluation of the processes are included.
Reducing Door-to-Needle Times using Toyota’s Lean Manufacturing Principles and Value Stream Analysis
Ford, Andria L.; Williams, Jennifer A.; Spencer, Mary; McCammon, Craig; Khoury, Naim; Sampson, Tomoko; Panagos, Peter; Lee, Jin-Moo
2012-01-01
Background: Earlier tPA treatment for acute ischemic stroke increases efficacy, prompting national efforts to reduce door-to-needle times (DNTs). We utilized lean process improvement methodology to develop a streamlined IV tPA protocol.
Methods: In early 2011, a multi-disciplinary team analyzed the steps required to treat acute ischemic stroke patients with IV tPA, utilizing value stream analysis (VSA). We directly compared the tPA-treated patients in the "pre-VSA" epoch to the "post-VSA" epoch with regard to baseline characteristics, protocol metrics, and clinical outcomes.
Results: The VSA revealed several tPA protocol inefficiencies: routing of patients to room, then to CT, then back to room; serial processing of work flow; and delays in waiting for lab results. On 3/1/2011, a new protocol incorporated changes to minimize delays: routing patients directly to head CT prior to the patient room, utilizing parallel-process work flow, and implementing point-of-care labs. In the pre- and post-VSA epochs, 132 and 87 patients were treated with IV tPA, respectively. Compared to pre-VSA, DNTs and the percent of patients treated ≤60 minutes from hospital arrival were improved in the post-VSA epoch: 60 min vs. 39 min (p<0.0001) and 52% vs. 78% (p<0.0001), respectively, with no change in symptomatic hemorrhage rate.
Conclusions: Lean process improvement methodology can expedite time-dependent stroke care without compromising safety. PMID:23138440
Long term load forecasting accuracy in electric utility integrated resource planning
Carvallo, Juan Pablo; Larsen, Peter H.; Sanstad, Alan H.; ...
2018-05-23
Forecasts of electricity consumption and peak demand over time horizons of one or two decades are a key element in electric utilities' meeting their core objective and obligation to ensure reliable and affordable electricity supplies for their customers while complying with a range of energy and environmental regulations and policies. These forecasts are an important input to integrated resource planning (IRP) processes involving utilities, regulators, and other stakeholders. Despite their importance, however, there has been little analysis of long term utility load forecasting accuracy. We conduct a retrospective analysis of long term load forecasts made by twelve Western U.S. electric utilities in the mid-2000s and find that most overestimated both energy consumption and peak demand growth; a key reason for this was the use of assumptions that led to an overestimation of economic growth. We find that the complexity of forecast methods and the accuracy of these forecasts are mildly correlated. In addition, sensitivity and risk analysis of load growth and its implications for capacity expansion were not well integrated with subsequent implementation. Finally, we review changes in the utilities' load forecasting methods over the subsequent decade, and discuss the policy implications of long term load forecast inaccuracy and its underlying causes.
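The retrospective accuracy analysis described above can be illustrated with a small sketch that compares long-term forecasts against realized values. The figures below are hypothetical, not drawn from the study; a positive mean percentage error corresponds to the over-forecasting the authors report.

```python
def forecast_errors(forecast, actual):
    """Percentage error of each forecast against the realized value;
    positive values mean the utility over-forecast demand."""
    return {year: 100.0 * (forecast[year] - actual[year]) / actual[year]
            for year in forecast if year in actual}

# Hypothetical decade-ahead peak-demand forecasts vs. realized load (MW).
forecast = {2010: 5200, 2012: 5500, 2014: 5900}
actual   = {2010: 5000, 2012: 5100, 2014: 5200}

errors = forecast_errors(forecast, actual)
bias = sum(errors.values()) / len(errors)  # mean percentage error
```

A positive `bias` over many forecast vintages is the kind of systematic overestimation the retrospective analysis identifies; a symmetric metric such as mean absolute percentage error would instead measure overall accuracy regardless of direction.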
National Conference on Integrated Resource Planning: Proceedings
NASA Astrophysics Data System (ADS)
Until recently, state regulators have focused most of their attention on the development of least-cost or integrated resource planning (IRP) processes for electric utilities. A number of commissions are beginning to scrutinize the planning processes of local gas distribution companies (LDCs) because of the increased control that LDCs have over their purchased gas costs (as well as the associated risks) and because of questions surrounding the role and potential of gas end-use efficiency options. Traditionally, resource planning for LDCs has concentrated on options for purchasing and storing gas. Integrated resource planning involves the creation of a process in which supply-side and demand-side options are integrated to create a resource mix that reliably satisfies customers' short-term and long-term energy service needs at the lowest cost. As applied to gas utilities, an integrated resource plan seeks to balance cost and reliability, and should not be interpreted simply as the search for lowest commodity costs. The National Association of Regulatory Utility Commissioners' (NARUC) Energy Conservation Committee asked Lawrence Berkeley Laboratory (LBL) to survey state PUCs to determine the extent to which they have undertaken least-cost planning for gas utilities. The survey included the following topics: status of state PUC least-cost planning regulations and practices for gas utilities; type and scope of natural gas DSM programs in effect, including fuel substitution; economic tests and analysis methods used to evaluate DSM programs; relationship between prudency reviews of gas utility purchasing practices and integrated resource planning; and key regulatory issues facing gas utilities during the next five years.
A Comparative Analysis of Extract, Transformation and Loading (ETL) Process
NASA Astrophysics Data System (ADS)
Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.
2018-02-01
The current growth of data and information occurs rapidly and in varying amounts and media. This development will eventually produce large volumes of data, better known as Big Data. Business Intelligence (BI) utilizes large volumes of data and information for analysis so that important information can be obtained; this information can then be used to support decision-making processes. In practice, a process for integrating existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting which application is most effective and efficient in terms of time, cost, and computing power can be a challenge. Therefore, the objective of this study was to provide a comparative analysis of the ETL process using Microsoft SQL Server Integration Services (SSIS) and Pentaho Data Integration (PDI).
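The three ETL stages the abstract describes can be sketched with only the Python standard library (csv for extraction, an in-memory sqlite3 database as a stand-in warehouse). The sample data and table name are illustrative assumptions; production tools such as SSIS or PDI add scheduling, logging, and error handling on top of this core pattern.

```python
import csv
import io
import sqlite3

# Extract: read raw records from a CSV source (an in-memory sample here).
raw = io.StringIO("id,name,amount\n1, Alice ,100\n2,Bob,abc\n3,Carol,250\n")
rows = list(csv.DictReader(raw))

# Transform: trim names, coerce types, and reject unparseable records.
clean = []
for r in rows:
    try:
        clean.append((int(r["id"]), r["name"].strip(), float(r["amount"])))
    except ValueError:
        continue  # row 2's amount "abc" fails the type check and is dropped

# Load: insert the conformed rows into the warehouse table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (id INTEGER, name TEXT, amount REAL)")
db.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)
total = db.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

Comparing ETL tools largely comes down to how well each automates, parallelizes, and monitors these same three stages at scale.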
OverPlotter: A Utility for Herschel Data Processing
NASA Astrophysics Data System (ADS)
Zhang, L.; Mei, Y.; Schulz, B.
2008-08-01
The OverPlotter utility is a GUI tool written in Java to support interactive data processing (DP) and analysis for the Herschel Space Observatory within the framework of the Herschel Common Science System (HCSS)(Wieprecht et al 2004). The tool expands upon the capabilities of the TableViewer (Zhang & Schulz 2005), providing now also the means to create additional overlays of several X/Y scatter plots within the same display area. These layers can be scaled and panned, either individually, or together as one graph. Visual comparison of data with different origins and units becomes much easier. The number of available layers is not limited, except by computer memory and performance. Presentation images can be easily created by adding annotations, labeling layers and setting colors. The tool will be very helpful especially in the early phases of Herschel data analysis, when a quick access to contents of data products is important.
Albrecht, Jessica; Kopietz, Rainer; Frasnelli, Johannes; Wiesmann, Martin; Hummel, Thomas; Lundström, Johan N.
2009-01-01
Almost every odor we encounter in daily life has the capacity to produce a trigeminal sensation. Surprisingly, few functional imaging studies exploring human neuronal correlates of intranasal trigeminal function exist, and results are to some degree inconsistent. We utilized activation likelihood estimation (ALE), a quantitative voxel-based meta-analysis tool, to analyze functional imaging data (fMRI/PET) following intranasal trigeminal stimulation with carbon dioxide (CO2), a stimulus known to exclusively activate the trigeminal system. Meta-analysis tools are able to identify activations common across studies, thereby enabling activation mapping with higher certainty. Activation foci of nine studies utilizing trigeminal stimulation were included in the meta-analysis. We found significant ALE scores, indicating consistent activation across studies, in the brainstem, ventrolateral posterior thalamic nucleus, anterior cingulate cortex, insula, precentral gyrus, and primary and secondary somatosensory cortices, a network known for the processing of intranasal nociceptive stimuli. Significant ALE values were also observed in the piriform cortex, insula, and orbitofrontal cortex, areas known to process chemosensory stimuli, and in association cortices. Additionally, the trigeminal ALE statistics were directly compared with ALE statistics originating from olfactory stimulation, demonstrating considerable overlap in activation. In conclusion, the results of this meta-analysis map the human neuronal correlates of intranasal trigeminal stimulation with high statistical certainty and demonstrate that the cortical areas recruited during the processing of intranasal CO2 stimuli include those outside traditional trigeminal areas. Moreover, by illustrating the considerable overlap between brain areas that process trigeminal and olfactory information, these results demonstrate the interconnectivity of flavor processing. PMID:19913573
Link Analysis in the Mission Planning Lab
NASA Technical Reports Server (NTRS)
McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang
2011-01-01
The legacy communications link analysis software currently used at Wallops Flight Facility involves processes that are different for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all the processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal-to-noise ratio according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software (see figure). The first interface is an HTML utility, developed in Visual Basic, that enhances analysis for plume modeling and offers a more user-friendly, flexible tool. A graphical user interface (GUI) written in MATLAB (see figure, upper right-hand corner) is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the required link budgets to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.
Agyeman-Yeboah, Joana; Korsah, Kwadwo Ameyaw; Okrah, Jane
2017-01-01
The nursing process is a tool recommended for use by all professional nurses working in Ghana in order to provide nursing care. However, the tool is currently underused by nurses in Ghana. The purpose of this study was to explore the factors that influence utilization of the nursing process. An exploratory, descriptive, qualitative research design was employed. Ten participants were recruited using purposive sampling. A semi-structured interview guide was used to collect data from the research participants, and the data were analysed using content analysis. One main theme, with five subthemes, emerged from the analysis. Factors such as nurses not gaining a good understanding of the nursing process while in school, the absence of care-plan forms on the ward, and a lack of adequate staff and time contributed to non-use of the nursing process. We conclude that clinical utilization of the nursing process is influenced by nurses' limited understanding of the nursing process and the care plan, as well as by shortages of nurses and time. We recommend that the care-plan form be made an official part of the admission documents. Furthermore, nursing administration should put measures in place to provide nurses with the resources needed to implement the nursing process, and should ensure that care-plan forms and other resources are provided regularly and adequately. Nurses should see the nursing process as a means of providing comprehensive care to their patients and addressing their specific problems, and should make time, despite busy schedules, to use it in order to improve the quality of care and the image of nursing in Ghana.
Span graphics display utilities handbook, first edition
NASA Technical Reports Server (NTRS)
Gallagher, D. L.; Green, J. L.; Newman, R.
1985-01-01
The Space Physics Analysis Network (SPAN) is a computer network connecting scientific institutions throughout the United States. This network provides an avenue for timely, correlative research between investigators, in a multidisciplinary approach to space physics studies. An objective in the development of SPAN is to make available direct and simplified procedures that scientists can use, without specialized training, to exchange information over the network. Information exchanges include raw and processed data, analysis programs, correspondence, documents, and graphic images. This handbook details procedures that can be used to exchange graphic images over SPAN. The intent is to periodically update this handbook to reflect the constantly changing facilities available on SPAN. The utilities described within reflect an earnest attempt to provide useful descriptions of working utilities that can be used to transfer graphic images across the network. Whether graphic images are representative of satellite observations or theoretical modeling, and whether they are of device-dependent or device-independent type, the SPAN graphics display utilities handbook will be the user's guide to graphic image exchange.
2015-03-12
(Table-of-contents residue: Table 3, Optometry Clinic Frequency Count; Table 22, Probability Distribution Summary Table.) …Clinic, the Audiology Clinic, and the Optometry Clinic. Methodology Overview: The overarching research goal is to identify feasible solutions to…
ERIC Educational Resources Information Center
Wu, Ya-Ping; Mirenda, Pat; Wang, Hwa-Pey; Chen, Ming-Chung
2010-01-01
This case study describes the processes of functional analysis and modality assessment that were utilized to design a communication intervention for an adolescent with autism who engaged in loud and disruptive vocalizations for most of the school day. The functional analysis suggested that the vocalizations served both tangible and escape…
Closed Loop Requirements and Analysis Management
NASA Technical Reports Server (NTRS)
Lamoreaux, Michael; Verhoef, Brett
2015-01-01
Effective systems engineering involves the use of analysis in the derivation of requirements and the verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during change evaluation, and analyses are leveraged during design verification. Recommendations on concept-validation case studies are also discussed.
NASA Technical Reports Server (NTRS)
Hepp, A. F.; Palaszewski, B. A.; Landis, G. A.; Jaworske, D. A.; Colozza, A. J.; Kulis, M. J.; Heller, R. S.
2015-01-01
As humanity begins to reach out into the solar system, it has become apparent that supporting a human or robotic presence in transit and/or on station requires significant expendable resources including consumables (to support people), fuel, and convenient reliable power. Transporting all necessary expendables is inefficient, inconvenient, costly, and, in the final analysis, a complicating factor for mission planners and a significant source of potential failure modes. Over the past twenty-five years, beginning with the Space Exploration Initiative, researchers at the NASA Glenn Research Center (GRC), academic collaborators, and industrial partners have analyzed, researched, and developed successful solutions for the challenges posed by surviving and even thriving in the resource-limited environment(s) presented by near-Earth space and non-terrestrial surface operations. In this retrospective paper, we highlight the efforts of the co-authors in resource simulation and utilization, materials processing and consumable(s) production, power systems and analysis, fuel storage and handling, propulsion systems, and mission operations. As we move forward in our quest to explore space using a resource-optimized approach, it is worthwhile to consider lessons learned relative to efficient utilization of the (comparatively) abundant natural resources and improving the sustainability (and environment) for life on Earth. We reconsider Lunar (and briefly Martian) resource utilization for potential colonization, and discuss next steps moving away from Earth.
Atmosphere Explorer control system software (version 1.0)
NASA Technical Reports Server (NTRS)
Villasenor, A.
1972-01-01
The basic design of the Atmosphere Explorer Control System (AECS) software used in the testing, integration, and flight control of the AE spacecraft and experiments is described. The software performs several vital functions, such as issuing commands to the spacecraft and experiments, receiving and processing telemetry data, and allowing for extensive data processing by experiment analysis programs. The major processing sections are: the executive control section, telemetry decommutation section, command generation section, and utility section.
Data processing 1: Advancements in machine analysis of multispectral data
NASA Technical Reports Server (NTRS)
Swain, P. H.
1972-01-01
Multispectral data processing procedures are outlined, beginning with the data display process used to accomplish data editing and proceeding through clustering, a feature selection criterion based on error probability estimation, and sample clustering and classification. The effective utilization of large quantities of remote sensing data through a three-stage sampling model for evaluating crop acreage estimates represents an improvement in determining the cost-benefit relationship associated with remote sensing technology.
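The classification stage outlined above can be sketched as a minimum-distance-to-means classifier over two spectral bands. The class means and pixel values below are hypothetical, not taken from the original data; operational systems typically use maximum-likelihood or more sophisticated classifiers over many bands.

```python
from math import dist  # Euclidean distance (Python 3.8+)

# Hypothetical mean spectral signatures (two bands) per land-cover class.
class_means = {
    "water":      (20.0, 10.0),
    "vegetation": (60.0, 90.0),
    "soil":       (90.0, 40.0),
}

def classify(pixel, means):
    """Assign the pixel to the class with the nearest mean vector."""
    return min(means, key=lambda c: dist(pixel, means[c]))

# Each pixel is a vector of per-band reflectance values.
pixels = [(22.0, 12.0), (58.0, 85.0), (88.0, 45.0)]
labels = [classify(p, class_means) for p in pixels]
```

Clustering serves the same pipeline from the unsupervised side: it discovers the candidate class means that a classifier like this one then uses to label every pixel in the scene.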
ERIC Educational Resources Information Center
Watkins, Arthur Noel
The purpose of this study was to identify and describe the decision-making processes in senior high schools that were implementing programs of individualized schooling. Field methodology, including interviews, observations, and analysis of documents, was used to gather data in six senior high schools of varying size located throughout the country,…
ERIC Educational Resources Information Center
Fabiszak, Malgorzata
2010-01-01
This paper is an application of Robert E. MacLaury's Vantage Theory (VT) to the analysis of real life spoken discourse. It utilizes Dennis R. Preston's (1994) modification of MacLaury's VT. It elucidates how cognitive processes of coordinate selection and combination contribute to the on-line construction of category membership in the abstract…
Field methods and data processing techniques associated with mapped inventory plots
William A. Bechtold; Stanley J. Zarnoch
1999-01-01
The U.S. Forest Inventory and Analysis (FIA) and Forest Health Monitoring (FHM) programs utilize a fixed-area mapped-plot design as the national standard for extensive forest inventories. The mapped-plot design is explained, as well as the rationale for its selection as the national standard. Ratio-of-means estimators are presented as a method to process data from...
SIGMA Release v1.2 - Capabilities, Enhancements and Fixes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahadevan, Vijay; Grindeanu, Iulian R.; Ray, Navamita
In this report, we present details on the SIGMA toolkit along with its component structure, capabilities, feature additions in FY15, release cycles, and continuous integration process. These software processes, along with updated documentation, are imperative for successful integration and utilization in several applications, including the SHARP coupled analysis toolkit for reactor core systems funded under the NEAMS DOE-NE program.
Desktop microsimulation: a tool to improve efficiency in the medical office practice.
Montgomery, James B; Linville, Beth A; Slonim, Anthony D
2013-01-01
Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.
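The kind of discrete-event analysis described above can be sketched in a few lines: patients arrive, wait for a free exam room, and are served for a fixed time, yielding queue waits and room utilization. The arrival times, service time, and room count below are hypothetical, not taken from the study:

```python
def simulate(arrivals, service_time, rooms):
    """Tiny discrete-event sketch of an exam-room queue.

    Returns each patient's wait and the overall room utilization."""
    free_at = [0.0] * rooms          # next time each room becomes free
    waits, busy = [], 0.0
    for t in arrivals:               # arrivals sorted by time
        i = min(range(rooms), key=lambda r: free_at[r])  # earliest-free room
        start = max(t, free_at[i])   # wait if the room is still occupied
        waits.append(start - t)
        free_at[i] = start + service_time
        busy += service_time
    makespan = max(free_at)
    utilization = busy / (rooms * makespan)
    return waits, utilization

waits, util = simulate(arrivals=[0, 1, 2, 3], service_time=4, rooms=2)
```

Even this toy model shows the trade-off the study explores: with two rooms and these arrivals, the last two patients each wait 2 time units while the rooms run near 89% utilization.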
Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...
2014-12-31
Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative-free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
High efficiency solution processed sintered CdTe nanocrystal solar cells: the role of interfaces.
Panthani, Matthew G; Kurley, J Matthew; Crisp, Ryan W; Dietz, Travis C; Ezzyat, Taha; Luther, Joseph M; Talapin, Dmitri V
2014-02-12
Solution processing of photovoltaic semiconducting layers offers the potential for drastic cost reduction through improved materials utilization and high device throughput. One compelling solution-based processing strategy utilizes semiconductor layers produced by sintering nanocrystals into large-grain semiconductors at relatively low temperatures. Using n-ZnO/p-CdTe as a model system, we fabricate sintered CdTe nanocrystal solar cells processed at 350 °C with power conversion efficiencies (PCE) as high as 12.3%. JSC of over 25 mA cm(-2) are achieved, which are comparable or higher than those achieved using traditional, close-space sublimated CdTe. We find that the VOC can be substantially increased by applying forward bias for short periods of time. Capacitance measurements as well as intensity- and temperature-dependent analysis indicate that the increased VOC is likely due to relaxation of an energetic barrier at the ITO/CdTe interface.
Theory of Collective Intelligence
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2003-01-01
In this chapter an analysis of the behavior of an arbitrary (perhaps massive) collective of computational processes in terms of an associated "world" utility function is presented. We concentrate on the situation where each process in the collective can be viewed as though it were striving to maximize its own private utility function. For such situations the central design issue is how to initialize/update the collective's structure, and in particular the private utility functions, so as to induce the overall collective to behave in a way that has large values of the world utility. Traditional "team game" approaches to this problem simply set each private utility function equal to the world utility function. The "Collective Intelligence" (COIN) framework is a semi-formal set of heuristics that recently have been used to construct private utility functions that in many experiments have resulted in world utility values up to orders of magnitude superior to those ensuing from use of the team game utility. In this paper we introduce a formal mathematics for analyzing and designing collectives. We also use this mathematics to suggest new private utilities that should outperform the COIN heuristics in certain kinds of domains. In accompanying work we use that mathematics to explain previous experimental results concerning the superiority of COIN heuristics. In that accompanying work we also use the mathematics to make numerical predictions, some of which we then test. In this way these two papers establish the study of collectives as a proper science, involving theory, explanation of old experiments, prediction concerning new experiments, and engineering insights.
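The team-game baseline described above (each private utility set equal to the world utility) can be sketched as follows; the quadratic world utility and the greedy best-response updates are illustrative assumptions, not the paper's formal mathematics:

```python
# World utility: highest (zero) when the agents' joint actions sum to a target.
def world_utility(actions, target=10):
    return -(sum(actions) - target) ** 2

def team_game_step(actions, i, moves=(-1, 0, 1)):
    """Agent i greedily picks the move that maximizes its private utility,
    which in a team game is simply the world utility itself."""
    best = max(moves, key=lambda m: world_utility(
        actions[:i] + [actions[i] + m] + actions[i + 1:]))
    actions[i] += best
    return actions

actions = [0, 0, 0]
for _ in range(5):                      # a few rounds of sequential best response
    for i in range(len(actions)):
        actions = team_game_step(actions, i)
```

The collective climbs to a joint action whose sum hits the target, i.e. the world-utility optimum; the COIN heuristics the chapter discusses replace this shared utility with agent-specific ones that are easier to learn in large collectives.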
Anima: Modular Workflow System for Comprehensive Image Data Analysis
Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa
2014-01-01
Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing, through segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541
NASA Astrophysics Data System (ADS)
Dostal, P.; Krasula, L.; Klima, M.
2012-06-01
Various image processing techniques in multimedia technology are optimized using the visual attention features of the human visual system. Because visual attention is spatially non-uniform, different locations in an image are of different importance for the perception of the image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF on the information that the human brain can ideally gain from the reference image, and FSIM utilizes low-level features to assign a different importance to each location in the image. But still none of these objective metrics utilizes the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed to reconstruct the ROI in fine quality while the rest of the image is reconstructed with low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
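As a rough illustration of the ROI-aware criterion the paper argues for, the sketch below weights region-of-interest pixels more heavily in a simple MSE. The weighting scheme and the pixel values are invented for illustration; this is not the metric proposed by the authors:

```python
def roi_weighted_mse(ref, test, roi_mask, roi_weight=4.0):
    """MSE that weights ROI pixels more heavily, so degradation inside the
    region of interest is penalized more than degradation outside it."""
    num = den = 0.0
    for r, t, m in zip(ref, test, roi_mask):
        w = roi_weight if m else 1.0
        num += w * (r - t) ** 2
        den += w
    return num / den

ref = [10, 10, 10, 10]
roi = [1, 1, 0, 0]                                       # first two pixels are ROI
good_roi = roi_weighted_mse(ref, [10, 10, 5, 5], roi)    # errors outside the ROI
bad_roi = roi_weighted_mse(ref, [5, 5, 10, 10], roi)     # same errors, inside ROI
```

The same pixel-wise error yields a four-times-worse score when it falls inside the ROI, which is exactly the asymmetry a conventional global metric ignores.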
Robust analysis of semiparametric renewal process models
Lin, Feng-Chang; Truong, Young K.; Fine, Jason P.
2013-01-01
Summary A rate model is proposed for a modulated renewal process comprising a single long sequence, where the covariate process may not capture the dependencies in the sequence as in standard intensity models. We consider partial likelihood-based inferences under a semiparametric multiplicative rate model, which has been widely studied in the context of independent and identical data. Under an intensity model, gap times in a single long sequence may be used naively in the partial likelihood with variance estimation utilizing the observed information matrix. Under a rate model, the gap times cannot be treated as independent and studying the partial likelihood is much more challenging. We employ a mixing condition in the application of limit theory for stationary sequences to obtain consistency and asymptotic normality. The estimator's variance is quite complicated owing to the unknown gap times dependence structure. We adapt block bootstrapping and cluster variance estimators to the partial likelihood. Simulation studies and an analysis of a semiparametric extension of a popular model for neural spike train data demonstrate the practical utility of the rate approach in comparison with the intensity approach. PMID:24550568
Qiu, Shanshan; Wang, Jun; Gao, Liping
2014-07-09
An electronic nose (E-nose) and an electronic tongue (E-tongue) have been used to characterize five types of strawberry juices based on processing approach (i.e., microwave pasteurization, steam blanching, high-temperature short-time pasteurization, frozen-thawed, and freshly squeezed). Juice quality parameters (vitamin C, pH, total soluble solids, total acid, and sugar/acid ratio) were detected by traditional measuring methods. Multivariate statistical methods (linear discriminant analysis (LDA) and partial least squares regression (PLSR)) and machine-learning methods (Random Forest (RF) and Support Vector Machines (SVM)) were employed for qualitative classification and quantitative regression. The E-tongue system reached higher accuracy rates than the E-nose did, and simultaneous utilization of both showed an advantage in LDA classification and PLSR regression. According to cross-validation, RF showed outstanding and indisputable performance in both the qualitative and quantitative analysis. This work indicates that the simultaneous utilization of E-nose and E-tongue can successfully discriminate processed fruit juices and predict quality parameters for the beverage industry.
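A classification step of this kind can be sketched with a nearest-centroid classifier, used here as a minimal stand-in for the LDA applied in the study; the fused E-nose/E-tongue feature vectors and class labels below are purely hypothetical:

```python
def train_centroids(X, y):
    """Nearest-centroid classifier: store the mean feature vector per class."""
    groups = {}
    for features, label in zip(X, y):
        groups.setdefault(label, []).append(features)
    return {label: tuple(sum(v) / len(rows) for v in zip(*rows))
            for label, rows in groups.items()}

def predict(centroids, x):
    # Assign x to the class whose centroid is nearest (squared distance).
    return min(centroids, key=lambda lab: sum(
        (a - b) ** 2 for a, b in zip(x, centroids[lab])))

# Toy fused sensor readings: (E-nose feature, E-tongue feature).
X = [(0.9, 0.1), (1.0, 0.2), (0.1, 0.9), (0.2, 1.0)]
y = ["fresh", "fresh", "pasteurized", "pasteurized"]
model = train_centroids(X, y)
```

Fusing both sensor channels into one feature vector, as in the study's simultaneous-utilization setup, lets a single classifier exploit whichever channel separates the classes better.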
Endangered Mangroves in Segara Anakan, Indonesia: Effective and Failed Problem-Solving Policy Advice
NASA Astrophysics Data System (ADS)
Dharmawan, Budi; Böcher, Michael; Krott, Max
2017-09-01
The success of scientific knowledge transfer depends on whether the decision maker can transform the scientific advice into a policy that can be accepted by all involved actors. We use a science-policy interactions model called research-integration-utilization to observe the process of scientific knowledge transfer in the case of endangered mangroves in Segara Anakan, Indonesia. Scientific knowledge is produced within the scientific system (research), science-based solutions to problems are practically utilized by political actors (utilization), and important links between research and utilization must be made (integration). We looked for empirical evidence to test hypotheses about the research-integration-utilization model based on document analysis and expert interviews. Our study finds that the failures in knowledge transfer are caused by the inappropriate use of scientific findings. The district government is expected by presidential decree to use only scientifically sound recommendations as a prerequisite for designing the regulation. However, the district government prefers to implement its own solutions because it believes that it understands the solutions better than the researchers. In the process of integration, the researcher cannot be involved, since the selection of scientific recommendations fully depends on the interests of the district government as the powerful ally.
Utility-preserving anonymization for health data publishing.
Lee, Hyukki; Kim, Soohyung; Kim, Jong Wook; Chung, Yon Dohn
2017-07-11
Publishing raw electronic health records (EHRs) may be considered a breach of the privacy of individuals because they usually contain sensitive information. A common practice for privacy-preserving data publishing is to anonymize the data before publishing, so as to satisfy privacy models such as k-anonymity. Among various anonymization techniques, generalization is the most commonly used in medical/health data processing. Generalization inevitably causes information loss, and various methods have been proposed to reduce it. However, existing generalization-based data anonymization methods cannot avoid excessive information loss and preserve data utility. We propose a utility-preserving anonymization for privacy-preserving data publishing (PPDP). To preserve data utility, the proposed method comprises three parts: (1) a utility-preserving model, (2) counterfeit record insertion, and (3) a catalog of the counterfeit records. We also propose an anonymization algorithm using the proposed method, which applies a full-domain generalization algorithm. We evaluate our method in comparison with an existing method on two aspects: information loss measured through various quality metrics, and the error rate of analysis results. Across all quality metrics, our proposed method shows lower information loss than the existing method. In real-world EHR analysis, the results show only a small error between the data anonymized by the proposed method and the original data. Through experiments on various datasets, we show that the utility of EHRs anonymized by the proposed method is significantly better than that of data anonymized by previous approaches.
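The generalization technique discussed above can be sketched as follows. This is a generic illustration of age generalization plus a k-anonymity check, not the authors' utility-preserving algorithm, and the records are invented:

```python
from collections import Counter

def generalize_age(age, width=10):
    """Full-domain style generalization: replace an exact age with its decade."""
    lo = (age // width) * width
    return f"{lo}-{lo + width - 1}"

def is_k_anonymous(records, quasi_ids, k):
    """Each combination of quasi-identifier values must occur at least k times."""
    combos = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in combos.values())

# Toy records whose ZIP codes have already been generalized.
records = [{"age": 23, "zip": "135**"}, {"age": 27, "zip": "135**"},
           {"age": 31, "zip": "135**"}, {"age": 38, "zip": "135**"}]
for r in records:
    r["age"] = generalize_age(r["age"])
```

After generalization each (age range, ZIP) combination covers two records, so the table is 2-anonymous; the information loss is the lost age precision, which is exactly what the paper's counterfeit-record approach tries to reduce.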
NASA Astrophysics Data System (ADS)
Radomski, Bartosz; Ćwiek, Barbara; Mróz, Tomasz M.
2017-11-01
The paper presents a multicriteria decision aid analysis of the choice of a PV installation providing electric energy to a public utility building. From the energy management point of view, electricity obtained from solar radiation has become a crucial renewable energy source. Application of PV installations may prove a profitable solution from the energy, economic and ecological points of view for both existing and newly erected buildings. Featured variants of PV installations have been assessed by multicriteria analysis based on the ANP (Analytic Network Process) method. Technical, economic, energy and environmental criteria have been identified as the main decision criteria. The defined set of decision criteria has an open character and can be modified in the dialog process between the decision-maker and the expert - in the present case, an expert in planning the development of energy supply systems. The proposed approach has been used to evaluate three variants of PV installation acceptable for an existing educational building located in Poznań, Poland - the building of the Faculty of Chemical Technology, Poznań University of Technology. Multicriteria analysis based on the ANP method and the calculation software Super Decisions has proven to be an effective tool for energy planning, leading to the indication of the recommended variant of PV installation in existing and newly erected public buildings. The achieved results show the prospects and possibilities of rational renewable energy usage as a complex solution for public utility buildings.
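As a drastically simplified illustration of scoring alternatives against the four criterion groups named above, a weighted sum might look like this. ANP itself additionally models dependencies among criteria through a supermatrix, which this sketch omits, and the weights and variant scores are invented:

```python
def weighted_score(scores, weights):
    """Weighted-sum ranking of alternatives over a set of criteria."""
    return {alt: sum(scores[alt][c] * w for c, w in weights.items())
            for alt in scores}

# Hypothetical normalized criterion scores for three PV-installation variants.
weights = {"technical": 0.3, "economic": 0.3, "energy": 0.2, "environmental": 0.2}
scores = {
    "variant_A": {"technical": 0.8, "economic": 0.5, "energy": 0.7, "environmental": 0.6},
    "variant_B": {"technical": 0.6, "economic": 0.9, "energy": 0.6, "environmental": 0.5},
    "variant_C": {"technical": 0.7, "economic": 0.6, "energy": 0.8, "environmental": 0.9},
}
ranked = sorted(weighted_score(scores, weights).items(), key=lambda kv: -kv[1])
```

Changing the weights in the dialog between decision-maker and expert, as the paper describes, can reorder the ranking, which is why the criteria set is kept open.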
NASA Technical Reports Server (NTRS)
Mizell, Carolyn Barrett; Malone, Linda
2007-01-01
The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
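The idea of coupling a continuous (system dynamics) state with a discrete process model can be shown in toy form: a continuously decaying productivity level drives step-by-step task completion. The fatigue-decay rule and all parameters below are illustrative assumptions, not the model described in the paper:

```python
def simulate_project(tasks, base_rate, fatigue, steps):
    """Toy hybrid model: a continuous productivity state (eroded each step by
    a fatigue factor, with a floor) feeds a discrete task-completion process.
    Returns the step at which all tasks are done (or the step cap)."""
    productivity, t = base_rate, 0
    remaining = float(tasks)
    while remaining > 0 and t < steps:
        t += 1
        productivity = max(0.1, productivity * (1 - fatigue))  # continuous decay
        remaining -= productivity                              # discrete progress
    return t

t_no_fatigue = simulate_project(tasks=10, base_rate=1.0, fatigue=0.0, steps=100)
t_fatigued = simulate_project(tasks=10, base_rate=1.0, fatigue=0.1, steps=100)
```

Even this caricature reproduces the qualitative point of the paper: workforce effects modeled continuously (here, fatigue) stretch the schedule of the discrete development process well beyond the naive estimate.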
Sensitivity analysis of the add-on price estimate for the silicon web growth process
NASA Technical Reports Server (NTRS)
Mokashi, A. R.
1981-01-01
The web growth process, a silicon-sheet technology option developed for the flat plate solar array (FSA) project, was examined. Base case data for the technical and cost parameters of the technical- and commercial-readiness phases of the FSA project are projected. The process add-on price is analyzed using the base case data for cost parameters such as equipment, space, direct labor, materials and utilities, and for production parameters such as growth rate and run length, with a computer program developed specifically to perform the sensitivity analysis with improved price estimation. Silicon price, sheet thickness and cell efficiency are also discussed.
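A one-at-a-time sensitivity analysis of an add-on price can be sketched as follows; the price model and every parameter value here are hypothetical stand-ins, not the FSA program's actual cost model:

```python
def addon_price(equipment, labor, materials, utilities, growth_rate, run_length):
    """Hypothetical add-on price: total cost per run divided by the sheet
    area produced (growth rate x run length)."""
    total_cost = equipment + labor + materials + utilities
    area = growth_rate * run_length
    return total_cost / area

def sensitivity(base, param, deltas):
    # One-at-a-time sensitivity: perturb one parameter, hold the rest fixed.
    return {d: addon_price(**{**base, param: base[param] * (1 + d)})
            for d in deltas}

base = dict(equipment=100.0, labor=50.0, materials=30.0, utilities=20.0,
            growth_rate=2.0, run_length=10.0)
s = sensitivity(base, "growth_rate", [-0.2, 0.0, 0.2])
```

Because growth rate divides the cost, a 20% slowdown raises the add-on price far more than a 20% speedup lowers it, the kind of asymmetry such a sensitivity study is designed to expose.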
Characterization of Tactical Departure Scheduling in the National Airspace System
NASA Technical Reports Server (NTRS)
Capps, Alan; Engelland, Shawn A.
2011-01-01
This paper discusses and analyzes current day utilization and performance of the tactical departure scheduling process in the National Airspace System (NAS) to understand the benefits in improving this process. The analysis used operational air traffic data from over 1,082,000 flights during the month of January, 2011. Specific metrics included the frequency of tactical departure scheduling, site specific variances in the technology's utilization, departure time prediction compliance used in the tactical scheduling process and the performance with which the current system can predict the airborne slot that aircraft are being scheduled into from the airport surface. Operational data analysis described in this paper indicates significant room for improvement exists in the current system primarily in the area of reduced departure time prediction uncertainty. Results indicate that a significant number of tactically scheduled aircraft did not meet their scheduled departure slot due to departure time uncertainty. In addition to missed slots, the operational data analysis identified increased controller workload associated with tactical departures which were subject to traffic management manual re-scheduling or controller swaps. An analysis of achievable levels of departure time prediction accuracy as obtained by a new integrated surface and tactical scheduling tool is provided to assess the benefit it may provide as a solution to the identified shortfalls. A list of NAS facilities which are likely to receive the greatest benefit from the integrated surface and tactical scheduling technology are provided.
Modeling the Supply Process Using the Application of Selected Methods of Operational Analysis
NASA Astrophysics Data System (ADS)
Chovancová, Mária; Klapita, Vladimír
2017-03-01
The supply process is one of the most important enterprise activities. All raw materials, intermediate products and products moved within an enterprise are the subject of inventory management, and their effective management can significantly improve the enterprise's position on the market. For that reason, inventory needs to be managed, monitored, evaluated and controlled. The paper deals with utilizing methods of operational analysis in the field of inventory management, in terms of achieving economic efficiency while ensuring a particular customer service level.
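One classic operational-analysis method for inventory management is the economic order quantity (the abstract does not name the specific methods the paper applies, so EOQ serves here as a representative example):

```python
import math

def eoq(demand_rate, order_cost, holding_cost):
    """Economic Order Quantity: Q* = sqrt(2 * D * S / H), the order size that
    balances annual ordering cost against annual holding cost."""
    return math.sqrt(2 * demand_rate * order_cost / holding_cost)

def total_annual_cost(q, demand_rate, order_cost, holding_cost):
    # ordering cost (D/Q orders per year) + average holding cost (Q/2 on hand)
    return demand_rate / q * order_cost + q / 2 * holding_cost

# Hypothetical figures: 1200 units/year demand, $50 per order, $6/unit-year holding.
q_star = eoq(demand_rate=1200, order_cost=50, holding_cost=6)
```

At the optimum the two cost components are equal; ordering in any other quantity, larger or smaller, raises the total annual cost.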
1992-04-27
spectrum analysis. 4. CONCLUSIONS: The nonthermal synthesis of crystalline nanoparticles of aluminum nitride, silicon carbide and silicon nitride is... Surviving fragments of the proceedings contents include "Theory of Microwave Interactions with Ceramic Materials" (R.E. Newnham, S.J. Jang, M. Xu, and F. Jones; V.M. Kenkre), "An Analysis of the ... Performance of Microwave Process Systems Which Utilize High Q Cavities" (J.F. Gerling and G. Fournier), and microwave thermogravimetric analysis.
NEW GIS WATERSHED ANALYSIS TOOLS FOR SOIL CHARACTERIZATION AND EROSION AND SEDIMENTATION MODELING
A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed which utilizes a suite of automated scripts and a pair of processing-intensive executable programs operating on a personal computer platform.
Neuromorphic Computing for Very Large Test and Evaluation Data Analysis
2014-05-01
analysis and utilization of newly available hardware-based artificial neural network chips. These two aspects of the program are complementary. The...neuromorphic architectures research focused on long-term disruptive technologies with high risk but revolutionary potential. The hardware-based neural...today. Overall, hardware-based neural processing research allows us to study the fundamental system and architectural issues relevant for employing
F. Christian Zinkhan; Thomas P. Holmes; D. Evan Mercer
1994-01-01
With conjoint analysis as its foundation, a practical approach for measuring the utility and dollar value of non-market outputs from southern forests is described and analyzed. The approach can be used in the process of evaluating alternative silvicultural and broader natural resource management plans when non-market as well as market outputs are recognized. When...
Optimizing Utilization of Detectors
2016-03-01
provide a quantifiable process to determine how much time should be allocated to each task sharing the same asset. This optimized expected time... allocation is calculated by numerical analysis and Monte Carlo simulation. Numerical analysis determines the expectation by involving an integral and...determines the optimum time allocation of the asset by repeatedly running experiments to approximate the expectation of the random variables. This
Sample extraction is one of the most important steps in arsenic speciation analysis of solid dietary samples. One of the problem areas in this analysis is the partial extraction of arsenicals from seafood samples. The partial extraction allows the toxicity of the extracted arse...
ERIC Educational Resources Information Center
Kimball, Ezekiel
2016-01-01
This paper utilizes a critical post-pragmatist epistemological lens in tandem with an extended case analysis to explore how student affairs professionals process truth claims related to student experience. Findings from the study, which include the limited usage of formal theory and the iterative reconstruction of informal theory, are used to…
ERIC Educational Resources Information Center
Pawade, Yogesh R.; Diwase, Dipti S.
2016-01-01
Item analysis of Multiple Choice Questions (MCQs) is the process of collecting, summarizing and utilizing information from students' responses to evaluate the quality of test items. Difficulty Index (p-value), Discrimination Index (DI) and Distractor Efficiency (DE) are the parameters which help to evaluate the quality of MCQs used in an…
Microbial Cellulose Utilization: Fundamentals and Biotechnology
Lynd, Lee R.; Weimer, Paul J.; van Zyl, Willem H.; Pretorius, Isak S.
2002-01-01
Fundamental features of microbial cellulose utilization are examined at successively higher levels of aggregation encompassing the structure and composition of cellulosic biomass, taxonomic diversity, cellulase enzyme systems, molecular biology of cellulase enzymes, physiology of cellulolytic microorganisms, ecological aspects of cellulase-degrading communities, and rate-limiting factors in nature. The methodological basis for studying microbial cellulose utilization is considered relative to quantification of cells and enzymes in the presence of solid substrates as well as apparatus and analysis for cellulose-grown continuous cultures. Quantitative description of cellulose hydrolysis is addressed with respect to adsorption of cellulase enzymes, rates of enzymatic hydrolysis, bioenergetics of microbial cellulose utilization, kinetics of microbial cellulose utilization, and contrasting features compared to soluble substrate kinetics. A biological perspective on processing cellulosic biomass is presented, including features of pretreated substrates and alternative process configurations. Organism development is considered for “consolidated bioprocessing” (CBP), in which the production of cellulolytic enzymes, hydrolysis of biomass, and fermentation of resulting sugars to desired products occur in one step. Two organism development strategies for CBP are examined: (i) improve product yield and tolerance in microorganisms able to utilize cellulose, or (ii) express a heterologous system for cellulose hydrolysis and utilization in microorganisms that exhibit high product yield and tolerance. A concluding discussion identifies unresolved issues pertaining to microbial cellulose utilization, suggests approaches by which such issues might be resolved, and contrasts a microbially oriented cellulose hydrolysis paradigm to the more conventional enzymatically oriented paradigm in both fundamental and applied contexts. PMID:12209002
An alternative respiratory sounds classification system utilizing artificial neural networks.
Oweis, Rami J; Abdulhay, Enas W; Khayal, Amer; Awad, Areen
2015-01-01
Computerized lung sound analysis involves recording lung sounds via an electronic device, followed by computer analysis and classification based on specific signal characteristics such as the non-linearity and non-stationarity caused by air turbulence. An automatic analysis is necessary to avoid dependence on expert skills. This work revolves around exploiting autocorrelation in the feature extraction stage. All process stages were implemented in MATLAB. The classification was performed comparatively using both the artificial neural network (ANN) and adaptive neuro-fuzzy inference system (ANFIS) toolboxes. The methods were applied to 10 different respiratory sounds for classification. The ANN was superior to the ANFIS system and returned superior performance parameters: its accuracy, specificity, and sensitivity were 98.6%, 100%, and 97.8%, respectively, showing superiority to many recent approaches. The proposed method is an efficient, fast tool for the intended purpose, as manifested in the performance parameters, specifically accuracy, specificity, and sensitivity. Furthermore, utilizing the autocorrelation function for feature extraction in such applications enhances performance and avoids undesired computational complexity compared to other techniques.
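The autocorrelation-based feature extraction stage can be sketched as follows (in Python rather than the study's MATLAB; the perfectly periodic toy signal is illustrative):

```python
def autocorr(signal, max_lag):
    """Normalized autocorrelation of a signal, usable as a compact feature
    vector: lag 0 is always 1, and periodic structure shows up as large
    values at lags near the period."""
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]          # remove the DC component
    var = sum(v * v for v in x)
    return [sum(x[t] * x[t + lag] for t in range(n - lag)) / var
            for lag in range(max_lag + 1)]

# A toy signal with period 4: strong positive correlation at lag 4,
# strong negative correlation at half the period (lag 2).
sig = [0, 1, 0, -1] * 8
feats = autocorr(sig, max_lag=4)
```

For real lung sounds, such a lag-indexed vector compresses the signal's periodic and turbulent structure into a few numbers suitable as classifier input.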
Modeling the Structural Dynamic of Industrial Networks
NASA Astrophysics Data System (ADS)
Wilkinson, Ian F.; Wiley, James B.; Lin, Aizhong
Market systems consist of locally interacting agents who continuously pursue advantageous opportunities. Since the time of Adam Smith, a fundamental task of economics has been to understand how market systems develop and to explain their operation. During the intervening years, theory has largely stressed comparative statics analysis. Based on the assumptions of rational, utility- or profit-maximizing agents and negative (diminishing-returns) feedback processes, traditional economic analysis seeks to describe the (generally) unique state of an economy corresponding to an initial set of assumptions. The analysis is static in the sense that it does not describe the process by which an economy might get from one state to another.
Multidimensional Processing and Visual Rendering of Complex 3D Biomedical Images
NASA Technical Reports Server (NTRS)
Sams, Clarence F.
2016-01-01
The proposed technology uses advanced image analysis techniques to maximize the resolution and utility of medical imaging methods being used during spaceflight. We utilize COTS technology for medical imaging, but our applications require higher resolution assessment of the medical images than is routinely applied with nominal system software. By leveraging advanced data reduction and multidimensional imaging techniques utilized in analysis of Planetary Sciences and Cell Biology imaging, it is possible to significantly increase the information extracted from the onboard biomedical imaging systems. Year 1 focused on application of these techniques to the ocular images collected on ground test subjects and ISS crewmembers. Focus was on the choroidal vasculature and the structure of the optic disc. Methods allowed for increased resolution and quantitation of structural changes enabling detailed assessment of progression over time. These techniques enhance the monitoring and evaluation of crew vision issues during space flight.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Broderick, Robert Joseph; Quiroz, Jimmy Edward; Reno, Matthew J.
2015-11-01
The third solicitation of the California Solar Initiative (CSI) Research, Development, Demonstration and Deployment (RD&D) Program established by the California Public Utility Commission (CPUC) is supporting the Electric Power Research Institute (EPRI), National Renewable Energy Laboratory (NREL), and Sandia National Laboratories (SNL), with collaboration from Pacific Gas and Electric (PG&E), Southern California Edison (SCE), and San Diego Gas and Electric (SDG&E), in research to improve the Utility Application Review and Approval process for interconnecting distributed energy resources to the distribution system. Currently this process is the most time-consuming of any step on the path to generating power on the distribution system. This CSI RD&D solicitation three project has completed the tasks of collecting data from the three utilities, clustering feeder characteristic data to attain representative feeders, detailed modeling of 16 representative feeders, analysis of PV impacts to those feeders, refinement of current screening processes, and validation of those suggested refinements. In this report each task is summarized to produce a final summary of all components of the overall project.
Analysis of real-time vibration data
Safak, E.
2005-01-01
In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. The Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and the stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real-time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.
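The running-window statistics mentioned above can be illustrated with an exponentially weighted tracker. This is a hedged sketch, not the paper's estimators: the update rule and the smoothing constant `alpha` (which sets the effective window length) are assumptions for illustration.

```python
class RunningStats:
    """Exponentially weighted running mean and mean-square tracker,
    suitable for sample-by-sample processing of streaming data."""

    def __init__(self, alpha=0.01):
        self.alpha = alpha      # smaller alpha -> longer effective window
        self.mean = 0.0
        self.mean_sq = 0.0

    def update(self, x):
        a = self.alpha
        self.mean = (1 - a) * self.mean + a * x
        self.mean_sq = (1 - a) * self.mean_sq + a * x * x
        return self.mean, self.mean_sq

rs = RunningStats(alpha=0.05)
for x in [1.0] * 200:           # a constant ambient vibration level
    mean, mean_sq = rs.update(x)
# both statistics converge toward the true values (1.0 and 1.0)
```

Because each update touches only two state variables, the tracker runs comfortably in real time while the data are being acquired, which is exactly the constraint the paper emphasizes.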
Calcification-carbonation method for red mud processing.
Li, Ruibing; Zhang, Tingan; Liu, Yan; Lv, Guozhi; Xie, Liqun
2016-10-05
Red mud, the Bayer process residue, is generated by the alumina industry and causes environmental problems. In this paper, a novel calcification-carbonation method that utilizes a large amount of the Bayer process residue is proposed. Using this method, the red mud was calcified with lime to transform the silicon phase into hydrogarnet, and the alkali in the red mud was recovered. Then, the resulting hydrogarnet was decomposed by CO2 carbonation, affording calcium silicate, calcium carbonate, and aluminum hydroxide. Alumina was recovered using an alkaline solution at a low temperature. The effects of the new process were analyzed by thermodynamic analysis and experiments. The extraction efficiencies of the alumina and soda obtained from the red mud reached 49.4% and 96.8%, respectively. The new red mud, with <0.3% alkali, can be used in cement production. Using a combination of this method and cement production, the Bayer process red mud can be completely utilized.
Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.
Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D
2018-05-01
To study the effect of homeopathic medicines (in higher potencies) in normal subjects, Peripheral Pulse Analyzer (PPA) has been used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data have been acquired in seven rounds; placebo was administered in rounds 1 and 2 and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to a group of around 40 subjects each. Although processing of data required human intervention, a software application has been developed to analyze the processed data and detect the response to eliminate the undue delay as well as human bias in subjective analysis. This utility named Automatic Analysis of Intervention in the Field of Homeopathy is run on the processed PPA data and the outcome has been compared with the manual analysis. The application software uses adaptive threshold based on statistics for detecting responses in contrast to fixed threshold used in manual analysis. The automatic analysis has detected 12.96% higher responses than subjective analysis. Higher response rates have been manually verified to be true positive. This indicates robustness of the application software. The automatic analysis software was run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility and 385 responses were detected in contrast to 272 of variability parameters. It was observed that 65% of the subjects, eliciting response, were common. This not only validates the software utility for giving consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).
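The adaptive statistical threshold contrasted with the fixed manual threshold can be sketched as follows. The threshold form (baseline mean plus k standard deviations) and the value of k are assumptions for illustration; the paper does not publish its exact statistic.

```python
import statistics

def detect_responses(baseline, trials, k=2.0):
    """Flag trial values exceeding an adaptive threshold derived
    from baseline statistics, instead of a fixed cut-off.

    k (assumed here) controls how far above the baseline mean a
    value must lie, in units of baseline standard deviation.
    """
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    threshold = mu + k * sigma
    return [x > threshold for x in trials]

# toy pulse-parameter values: quiet baseline, then two trial rounds
baseline = [10.1, 9.8, 10.0, 10.2, 9.9]
flags = detect_responses(baseline, [10.1, 12.5])
```

Deriving the threshold from each subject's own baseline is what lets the software adapt across subjects, which plausibly accounts for the extra responses it detected relative to the fixed-threshold manual analysis.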
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purcupile, J.C.
The purpose of this study is to apply the methodologies developed in the Energy Conservation in Coal Conversion August 1977 Progress Report (Contract No. EY77S024196) to an energy-efficient, near-term coal conversion process design, and to develop additional, general techniques for studying energy conservation and utilization in coal conversion processes. The process selected for study was the Ralph M. Parsons Company of Pasadena, California, ''Oil/Gas Complex, Conceptual Design/Economic Analysis,'' as described in R and D Report No. 114, Interim Report No. 4, published March 1977, ERDA Contract No. E(49-18)-1975. Thirteen papers representing possible alternative methods of energy conservation or waste heat utilization have been entered individually into EDB and ERA. (LTN)
Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson III; David R. Larsen; Jacob S. Fraser; Jian Yang
2013-01-01
Two challenges confronting forest landscape models (FLMs) are how to simulate fine, stand-scale processes while making large-scale (i.e., >10⁷ ha) simulation possible, and how to take advantage of extensive forest inventory data such as U.S. Forest Inventory and Analysis (FIA) data to initialize and constrain model parameters. We present the LANDIS PRO model that...
Qiao, Yuanhua; Keren, Nir; Mannan, M Sam
2009-08-15
Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
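The combination of a regression-based base frequency with a fuzzy model of route-independent variables can be sketched as follows. This is a toy illustration: the membership functions, the severity scale, and the multipliers are assumptions, not the paper's calibrated fuzzy rules or its negative binomial coefficients.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adjusted_frequency(base_freq, weather_severity):
    """Scale a route-dependent base accident frequency by a fuzzy
    factor representing one route-independent variable (weather,
    rated 0-10). Memberships and multipliers are illustrative."""
    mild = tri(weather_severity, -1, 0, 6)
    severe = tri(weather_severity, 4, 10, 11)
    # weighted-average (Sugeno-style) defuzzification of two multipliers
    total = mild + severe
    factor = (mild * 1.0 + severe * 2.0) / total if total else 1.0
    return base_freq * factor

# base frequency from the regression step, adjusted for severity 7/10
f = adjusted_frequency(0.002, 7.0)
```

Keeping the regression and fuzzy stages separate mirrors the paper's split between data-rich route-dependent variables (fit statistically) and expert-judged route-independent ones (encoded as rules).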
Biological reduction of chlorinated solvents: Batch-scale geochemical modeling
NASA Astrophysics Data System (ADS)
Kouznetsova, Irina; Mao, Xiaomin; Robinson, Clare; Barry, D. A.; Gerhard, Jason I.; McCarty, Perry L.
2010-09-01
Simulation of biodegradation of chlorinated solvents in dense non-aqueous phase liquid (DNAPL) source zones requires a model that accounts for the complexity of processes involved and that is consistent with available laboratory studies. This paper describes such a comprehensive modeling framework that includes microbially mediated degradation processes, microbial population growth and decay, geochemical reactions, as well as interphase mass transfer processes such as DNAPL dissolution, gas formation and mineral precipitation/dissolution. All these processes can be in equilibrium or kinetically controlled. A batch modeling example was presented where the degradation of trichloroethene (TCE) and its byproducts and concomitant reactions (e.g., electron donor fermentation, sulfate reduction, pH buffering by calcite dissolution) were simulated. Local and global sensitivity analysis techniques were applied to delineate the dominant model parameters and processes. Sensitivity analysis indicated that accurate values for parameters related to dichloroethene (DCE) and vinyl chloride (VC) degradation (i.e., DCE and VC maximum utilization rates, yield due to DCE utilization, decay rate for DCE/VC dechlorinators) are important for prediction of the overall dechlorination time. These parameters influence the maximum growth rate of the DCE and VC dechlorinating microorganisms and, thus, the time required for a small initial population to reach a sufficient concentration to significantly affect the overall rate of dechlorination. Self-inhibition of chlorinated ethenes at high concentrations and natural buffering provided by the sediment were also shown to significantly influence the dechlorination time. Furthermore, the analysis indicated that the rates of the competing, nonchlorinated electron-accepting processes relative to the dechlorination kinetics also affect the overall dechlorination time. 
Results demonstrated that the model developed is a flexible research tool that is able to provide valuable insight into the fundamental processes and their complex interactions during bioremediation of chlorinated ethenes in DNAPL source zones.
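The local sensitivity analysis used to rank parameters can be sketched generically. This is not the paper's biogeochemical model: `model` is a placeholder mapping a parameter dictionary to a scalar output (e.g., overall dechlorination time), and the toy first-order-kinetics model and step size are assumptions.

```python
def local_sensitivity(model, params, rel_step=0.01):
    """One-at-a-time local sensitivity: perturb each parameter by a
    small relative step and report the normalized (elasticity-style)
    coefficient d(output)/output per d(param)/param."""
    base = model(params)
    sens = {}
    for name, value in params.items():
        perturbed = dict(params)
        perturbed[name] = value * (1 + rel_step)
        sens[name] = (model(perturbed) - base) / base / rel_step
    return sens

# toy model: time to reach a target under first-order kinetics ~ 1/k,
# plus a lag while the microbial population grows
toy = lambda p: 10.0 / p["k_max"] + p["lag"]
s = local_sensitivity(toy, {"k_max": 2.0, "lag": 1.0})
# s["k_max"] is negative (faster utilization -> shorter time),
# s["lag"] is positive
```

Ranking parameters by |sensitivity| is how such an analysis singles out, e.g., DCE/VC maximum utilization rates as the dominant controls on dechlorination time.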
NASA Astrophysics Data System (ADS)
Fedarenka, Anton; Dubovik, Oleg; Goloub, Philippe; Li, Zhengqiang; Lapyonok, Tatyana; Litvinov, Pavel; Barel, Luc; Gonzalez, Louis; Podvin, Thierry; Crozel, Didier
2016-08-01
The study presents efforts to include polarimetric data in the routine inversion of ground-based radiometric measurements for characterizing atmospheric aerosols, and analyzes the advantages obtained in the retrieval results. First, a data preparation tool was developed to operationally process the large amount of polarimetric data. The AERONET inversion code, adapted for inversion of both intensity and polarization measurements, was used for processing. Second, in order to estimate the effect of utilizing polarimetric information on aerosol retrieval results, both synthetic data and real measurements were processed using the developed routine and analyzed. The sensitivity study was carried out using simulated data based on three main aerosol models: desert dust, urban industrial, and urban clean aerosols. The test investigated the effects of utilizing polarization data in the presence of random noise, bias in measurements of optical thickness, and angular pointing shift. The results demonstrate the advantage of polarization data utilization for aerosols with a pronounced concentration of fine particles. Further, an extended set of AERONET observations was processed. Data for three sites were used: GSFC, USA (clean urban aerosol dominated by fine particles), Beijing, China (polluted industrial aerosol characterized by a pronounced mixture of both fine and coarse modes), and Dakar, Senegal (desert dust dominated by coarse particles). The results revealed a considerable advantage of applying polarimetric data for characterizing fine-mode-dominated aerosols, including industrial pollution (Beijing). The use of polarization corrects the particle size distribution by decreasing the overestimated fine mode and increasing the coarse mode. It also increases the underestimated real part of the refractive index and improves the retrieval of the fraction of spherical particles, owing to the high sensitivity of polarization to particle shape.
Overall, the study demonstrates a substantial value of polarimetric data for improving aerosol characterization.
NASA Astrophysics Data System (ADS)
Abbott, Thomas Diamond
2001-07-01
As technology continues to become an integral part of our educational system, research that clarifies how various technologies affect learning should be available to educators prior to the large-scale introduction of any new technology into the classroom. This study will assess the degree to which a relatively new Geographic Information System software (ArcView 3.1), when utilized by high school freshmen in earth science and geography courses, can be used to (a) promote and develop integrated process skills in these students, and (b) improve their awareness and appraisal of their problem-solving abilities. Two research questions will be addressed in this research: (1) Will the use of a GIS to solve problems with authentic contexts enhance the learning and refinement of integrated process skills over more conventional means of classroom instruction? and (2) Will students' perceptions of competence to solve problems within authentic contexts be greater for those who learned to use and implement a GIS when compared to those who have learned by more conventional means of classroom instruction? Research Question 1 will be assessed using the Test of Integrated Process Skills II (TIPS II) and Research Question 2 using the Problem Solving Inventory (PSI). The research will last thirteen weeks. The TIPS II and the PSI will be administered after the intervention of GIS to the experimental group, at which point an analysis of covariance and the Mann-Whitney U-test will be utilized to measure the effects of the intervention by the independent variable. Teacher/researcher journals and teacher/student questionnaires will be used to complement the statistical analysis. It is hoped that this study will help in the creation of future instructional models that enable educators to utilize modern technologies appropriately in their classrooms.
Samadbeik, Mahnaz; Shahrokhi, Nafiseh; Saremian, Marzieh; Garavand, Ali; Birjandi, Mahdi
2017-01-01
In recent years, information technology has been introduced in the nursing departments of many hospitals to support their daily tasks. Nurses are the largest end-user group in Hospital Information Systems (HISs). This study was designed to evaluate data processing in the Nursing Information Systems (NISs) utilized in many university hospitals in Iran. This was a cross-sectional study. The population comprised all nurse managers and NIS users of the five training hospitals in Khorramabad city (N = 71). The nursing subset of the HIS-Monitor questionnaire was used to collect the data. Data were analyzed by the descriptive-analytical method and inductive content analysis. The results indicated that the nurses participating in the study did not take desirable advantage of paper-based (2.02) or computerized (2.34) information processing tools to perform nursing tasks. Moreover, the less work experience nurses had, the more they utilized computer tools for processing patient discharge information. The "readability of patient information" and "repetitive and time-consuming documentation" were stated by the participating nurses as the most important expectation and problem regarding the HIS, respectively. The nurses in the present study utilized paper and computerized information processing tools together to perform nursing practices. Therefore, it is recommended that nursing process redesign coincide with NIS implementation in health care centers.
Development of Advanced Coatings for Laser Modifications Through Process and Materials Simulation
NASA Astrophysics Data System (ADS)
Martukanitz, R. P.; Babu, S. S.
2004-06-01
A simulation-based system is currently being constructed to aid in the development of advanced coating systems for laser cladding and surface alloying. The system employs loosely coupled material and process models that allow rapid determination of material compatibility over a wide range of processing conditions. The primary emphasis is on the development and identification of composite coatings for improved wear and corrosion resistance. The material model utilizes computational thermodynamics and kinetic analysis to establish phase stability and extent of diffusional reactions that may result from the thermal response of the material during virtual processing. The process model is used to develop accurate thermal histories associated with the laser surface modification process and provides critical input for the non-isothermal materials simulations. These techniques were utilized to design a laser surface modification experiment that utilized the addition of stainless steel alloy 431 and TiC produced using argon and argon and nitrogen shielding. The deposits representing alloy 431 and TiC powder produced in argon resulted in microstructures retaining some TiC particles and an increase in hardness when compared to deposits produced using only the 431 powder. Laser deposits representing alloy 431 and TiC powder produced with a mixture of argon and nitrogen shielding gas resulted in microstructures retaining some TiC particles, as well as fine precipitates of Ti(CN) formed during cooling and a further increase in hardness of the deposit.
Using IBMs to Investigate Spatially-dependent Processes in Landscape Genetics Theory
Much of landscape and conservation genetics theory has been derived using non-spatial mathematical models. Here, we use a mechanistic, spatially-explicit, eco-evolutionary IBM to examine the utility of this theoretical framework in landscapes with spatial structure. Our analysis...
Preliminary techno-economic analysis of these processes will be undertaken, utilizing the literature and including key supporting data and proof-of-principle experiments. The emphasis on low-cost bioreactors and operation greatly enhances the economic feasibility and practica...
Automatic bio-sample bacteria detection system
NASA Technical Reports Server (NTRS)
Chappelle, E. W.; Colburn, M.; Kelbaugh, B. N.; Picciolo, G. L.
1971-01-01
Electromechanical device analyzes urine specimens in 15 minutes and processes one sample per minute. Instrument utilizes bioluminescent reaction between luciferase-luciferin mixture and adenosine triphosphate (ATP) to determine number of bacteria present in the sample. Device has potential application to analysis of other body fluids.
Land Use Management for Solid Waste Programs
ERIC Educational Resources Information Center
Brown, Sanford M., Jr.
1974-01-01
The author discusses the problems of solid waste disposal and examines various land use management techniques. These include the land use plan, zoning, regionalization, land utilities, and interim use. Information concerning solid waste processing site zoning and analysis is given. Bibliography included. (MA)
Effects of eHealth Literacy on General Practitioner Consultations: A Mediation Analysis
Fitzpatrick, Mary Anne; Hess, Alexandra; Sudbury-Riley, Lynn; Hartung, Uwe
2017-01-01
Background Most evidence (not all) points in the direction that individuals with a higher level of health literacy will less frequently utilize the health care system than individuals with lower levels of health literacy. The underlying reasons of this effect are largely unclear, though people’s ability to seek health information independently at the time of wide availability of such information on the Internet has been cited in this context. Objective We propose and test two potential mediators of the negative effect of eHealth literacy on health care utilization: (1) health information seeking and (2) gain in empowerment by information seeking. Methods Data were collected in New Zealand, the United Kingdom, and the United States using a Web-based survey administered by a company specialized on providing online panels. Combined, the three samples resulted in a total of 996 baby boomers born between 1946 and 1965 who had used the Internet to search for and share health information in the previous 6 months. Measured variables include eHealth literacy, Internet health information seeking, the self-perceived gain in empowerment by that information, and the number of consultations with one’s general practitioner (GP). Path analysis was employed for data analysis. Results We found a bundle of indirect effect paths showing a positive relationship between health literacy and health care utilization: via health information seeking (Path 1), via gain in empowerment (Path 2), and via both (Path 3). In addition to the emergence of these indirect effects, the direct effect of health literacy on health care utilization disappeared. Conclusions The indirect paths from health literacy via information seeking and empowerment to GP consultations can be interpreted as a dynamic process and an expression of the ability to find, process, and understand relevant information when that is necessary. PMID:28512081
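The product-of-coefficients logic behind the indirect paths can be sketched as follows. This is not the paper's path-analysis software or data: the closed-form OLS helper and the synthetic fully mediated dataset (X → M → Y with slopes 2 and 3) are illustrative assumptions for a single-mediator case.

```python
def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    """Population covariance; cov(x, x) is the variance."""
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

def indirect_effect(x, m, y):
    """Indirect effect X -> M -> Y as a * b:
    a = slope of M on X; b = slope of Y on M controlling for X
    (closed-form two-predictor OLS)."""
    a = cov(x, m) / cov(x, x)
    det = cov(m, m) * cov(x, x) - cov(x, m) ** 2
    b = (cov(m, y) * cov(x, x) - cov(x, y) * cov(x, m)) / det
    return a * b

# synthetic fully mediated data: M ~ 2X (small perturbation), Y = 3M,
# so the indirect effect should be close to 2 * 3 = 6
xs = list(range(8))
ms = [2 * x + 0.1 * (-1) ** x for x in xs]
ys = [3 * m for m in ms]
ind = indirect_effect(xs, ms, ys)
```

When mediation is complete, as constructed here, the direct effect of X on Y vanishes once M is controlled for, which is the pattern the paper reports for eHealth literacy, information seeking, and GP consultations.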
NASA Technical Reports Server (NTRS)
Hickey, J. S.
1983-01-01
The Mesoscale Analysis and Space Sensor (MASS) Data Management and Analysis System developed by Atsuko Computing International (ACI) on the MASS HP-1000 Computer System within the Systems Dynamics Laboratory of the Marshall Space Flight Center is described. The MASS Data Management and Analysis System was successfully implemented and is utilized daily by atmospheric scientists to graphically display and analyze large volumes of conventional and satellite-derived meteorological data. The scientists can interactively process various atmospheric data (sounding, single-level, grid, and image) by utilizing the MASS (AVE80) software, which shares common data and user inputs, thereby reducing overhead, optimizing execution time, and thus enhancing user flexibility, usability, and understandability of the total system/software capabilities. In addition, ACI installed eight APPLE III graphics/imaging computer terminals in individual scientists' offices and integrated them into the MASS HP-1000 Computer System, thus providing a significant enhancement to the overall research environment.
Analysis of ecological environment impact of coal exploitation and utilization
NASA Astrophysics Data System (ADS)
Zhang, Baoliu; Luo, Hong; Lv, Lianhong; Wang, Jian; Zhang, Baoshi
2018-02-01
Based on the theory of life cycle assessment, the ecological and environmental impacts of coal mining, processing, utilization, and transportation are analyzed, taking the status of China's coal exploitation and utilization as the basis. The analysis identifies the ecological and environmental impacts of coal development and utilization: the ecological impacts mainly consist of land damage, water resource destruction, and biodiversity loss, while the environmental impacts include air, water, and solid waste pollution. Finally, summarizing these ecological and environmental problems, solutions and countermeasures are proposed to promote the rational development and consumption of coal, to reduce the impact of coal production and consumption on the ecological environment, and ultimately to achieve the coordinated development of energy and the environment.
Resources and training in outpatient substance abuse treatment facilities.
Lehman, Wayne E K; Becan, Jennifer E; Joe, George W; Knight, Danica K; Flynn, Patrick M
2012-03-01
The exposure to new clinical interventions through formalized training and the utilization of strategies learned through training are two critical components of the program change process. This study considers the combined influence of actual program fiscal resources and counselors' perceptions of workplace resources on two mechanisms of training: exposure and utilization. Data were collected from 323 counselors nested within 59 programs located in nine states. Multilevel analysis revealed that training exposure and training utilization represent two distinct constructs that are important at different stages in the Program Change Model. Training exposure is associated primarily with physical and financial resources, whereas utilization is associated with professional community and job burnout. These results suggest that financial resources are important in initial exposure to new interventions but that successful utilization of new techniques depends in part on the degree of burnout and collaboration experienced by counselors.
Squires, Janet E; Estabrooks, Carole A; Newburn-Cook, Christine V; Gierl, Mark
2011-05-19
There is a lack of acceptable, reliable, and valid survey instruments to measure conceptual research utilization (CRU). In this study, we investigated the psychometric properties of a newly developed scale (the CRU Scale). We used the Standards for Educational and Psychological Testing as a validation framework to assess four sources of validity evidence: content, response processes, internal structure, and relations to other variables. A panel of nine international research utilization experts performed a formal content validity assessment. To determine response process validity, we conducted a series of one-on-one scale administration sessions with 10 healthcare aides. Internal structure and relations to other variables validity was examined using CRU Scale response data from a sample of 707 healthcare aides working in 30 urban Canadian nursing homes. Principal components analysis and confirmatory factor analyses were conducted to determine internal structure. Relations to other variables were examined using: (1) bivariate correlations; (2) change in mean values of CRU with increasing levels of other kinds of research utilization; and (3) multivariate linear regression. Content validity index scores for the five items ranged from 0.55 to 1.00. The principal components analysis predicted a 5-item 1-factor model. This was inconsistent with the findings from the confirmatory factor analysis, which showed best fit for a 4-item 1-factor model. Bivariate associations between CRU and other kinds of research utilization were statistically significant (p < 0.01) for the latent CRU scale score and all five CRU items. The CRU scale score was also shown to be a significant predictor of overall research utilization in multivariate linear regression. The CRU scale showed acceptable initial psychometric properties with respect to responses from healthcare aides in nursing homes.
Based on our validity, reliability, and acceptability analyses, we recommend using a reduced (four-item) version of the CRU Scale to yield sound assessments of CRU by healthcare aides. Refinement of the wording of one item is also needed. Planned future research will include: latent scale scoring, identification of variables that predict, and are outcomes of, conceptual research use, and longitudinal work to determine the CRU Scale's sensitivity to change.
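The principal components step used to probe the scale's internal structure can be sketched with a pure-Python power iteration. The study used standard statistical software; this sketch, with toy item-response data assumed here, only illustrates how a one-factor structure appears as a single dominant eigenvalue of the item covariance matrix.

```python
def first_component(rows, iters=100):
    """First principal component of item-response data (rows =
    respondents, columns = items) by power iteration on the
    covariance matrix. Returns (eigenvalue, loading vector)."""
    n, k = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(k)]
    # population covariance matrix of the k items
    c = [[sum((r[i] - means[i]) * (r[j] - means[j]) for r in rows) / n
          for j in range(k)] for i in range(k)]
    v = [1.0] * k
    for _ in range(iters):
        w = [sum(c[i][j] * v[j] for j in range(k)) for i in range(k)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    eigval = sum(v[i] * sum(c[i][j] * v[j] for j in range(k))
                 for i in range(k))
    return eigval, v

# toy data: five perfectly correlated items -> one factor captures
# all the variance (trace of the covariance matrix is 10)
rows = [[float(score)] * 5 for score in [1, 2, 3, 4, 5]]
eigval, v = first_component(rows)
```

With real scale data, the ratio of the first eigenvalue to the trace is the "variance explained" figure typically reported alongside such a one-factor claim.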
Knowledge and utilization of computer-software for statistics among Nigerian dentists.
Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I
2013-01-01
The use of computer software for statistical analysis has transformed health information and data to their simplest form in the areas of access, storage, retrieval, and analysis in the field of research. This survey was therefore carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with above 5 years' experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and used in our data analysis. Twenty-nine (29/62; 46.8%) respondents fell within those with 5-10 years of clinical experience, none of whom had completed the specialist training programme. Practitioners with above 10 years' clinical experience numbered 33 (33/62; 53.2%), of whom 15 (15/33; 45.5%) were specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists were actively involved in research activities, and only five (5/15; 33.3%) could utilize statistical analysis software unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria, strongly associated with a lack of exposure to the use of such software early enough, especially during undergraduate training. This calls for the introduction of a computer training programme in the dental curriculum to enable practitioners to develop the habit of using computer software for their research.
Proceedings of the 21st Project Integration Meeting
NASA Technical Reports Server (NTRS)
1983-01-01
Progress made by the Flat Plate Solar Array Project during the period April 1982 to January 1983 is described. Reports on polysilicon refining, thin film solar cell and module technology development, central station electric utility activities, silicon sheet growth and characteristics, advanced photovoltaic materials, cell and processes research, module technology, environmental isolation, engineering sciences, module performance and failure analysis and project analysis and integration are included.
Techno-economic analysis for the evaluation of three UCG synthesis gas end use approaches
NASA Astrophysics Data System (ADS)
Nakaten, Natalie; Kempka, Thomas; Burchart-Korol, Dorota; Krawczyk, Piotr; Kapusta, Krzysztof; Stańczyk, Krzysztof
2016-04-01
Underground coal gasification (UCG) enables the utilization of coal reserves that are not economically exploitable because of complex geological boundary conditions. In the present study we investigate UCG as a potentially economic approach for converting deep-seated coals into a synthesis gas, and examine three different utilization options. Depending on the geological boundary conditions and the chosen gasification agent, UCG synthesis gas is composed of varying amounts of methane, hydrogen, nitrogen, carbon monoxide and carbon dioxide. According to its calorific value, the processed UCG synthesis gas can be utilized in different ways, such as electricity generation in a combined cycle power plant or feedstock production making use of its various chemical components. In the present study we analyze the economics of UCG synthesis gas utilization in the context of clean electricity generation with an integrated carbon capture and storage (CCS) process as well as synthetic fuel and fertilizer production (Kempka et al., 2010), based on a gas composition achieved during an in situ UCG trial in the Wieczorek Mine. Hereby, we also consider chemical feedstock production as a means to mitigate CO2 emissions. Within a sensitivity analysis of UCG synthesis gas calorific value variations, we produce a range of capital and operational expenditure bandwidths that allow for an economic assessment of the different synthesis gas end use approaches. To carry out the integrated techno-economic assessment of the coupled systems and the sensitivity analysis, we adapted the techno-economic UCG-CCS model developed by Nakaten et al. (2014). Our techno-economic modeling results demonstrate that the calorific value has a high impact on the economics of UCG synthesis gas utilization. In the underlying study, the synthesis gas is not suitable for economically competitive electricity generation, owing to its relatively low calorific value of 4.5 MJ/Nm³.
To be a profitable option for electricity production, the UCG synthesis gas should have a calorific value of at least 7 MJ/Nm³. However, UCG feedstock production under the underlying geological and chemical boundary conditions can compete on the market. Kempka, T., Plötz, M.L., Hamann, J., Deowan, S.A., Azzam, R. (2010) Carbon dioxide utilisation for carbamide production by application of the coupled UCG-urea process. Energy Procedia 4: 2200-2205. Nakaten, N., Schlüter, R., Azzam, R., Kempka, T. (2014) Development of a techno-economic model for dynamic calculation of COE, energy demand and CO2 emissions of an integrated UCG-CCS process. Energy (in print). doi:10.1016/j.energy.2014.01.014
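The calorific-value threshold for profitable electricity generation can be put in perspective with a one-line conversion from gas heating value to electric output. The combined-cycle net efficiency below is an assumed round number for illustration, not a value from the study:

```python
def electricity_kwh_per_nm3(calorific_value_mj, net_efficiency=0.5):
    """Electric output per Nm^3 of synthesis gas; 1 kWh = 3.6 MJ.

    net_efficiency is an assumed combined-cycle value, not from the study.
    """
    return calorific_value_mj * net_efficiency / 3.6
```

Under this assumption, the 4.5 MJ/Nm³ gas yields about 0.63 kWh/Nm³ versus roughly 0.97 kWh/Nm³ at the 7 MJ/Nm³ profitability threshold.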
NASA Astrophysics Data System (ADS)
Di Lorenzo, R.; Ingarao, G.; Fonti, V.
2007-05-01
The crucial task in the prevention of ductile fracture is the availability of a tool for predicting when the defect occurs. The technical literature investigates this topic widely, and many authors have contributed following different approaches. The main class of approaches concerns the development of fracture criteria: generally, such criteria are expressed by determining a critical value of a damage function that depends on the stress and strain paths, and ductile fracture is assumed to occur when this critical value is reached during the analysed process. A relevant drawback is related to the utilization of ductile fracture criteria: each criterion usually performs well in predicting fracture for particular stress-strain paths, i.e. it works very well for certain processes but may give poor results for others. On the other hand, approaches based on damage mechanics formulations are very effective from a theoretical point of view, but they are very complex and their proper calibration is quite difficult. In this paper, two different approaches are investigated to predict fracture occurrence in cold forming operations. The final aim of the proposed method is a tool of general reliability, i.e. one able to predict fracture for different forming processes. The proposed approach represents a step forward within a research project focused on the utilization of innovative predictive tools for ductile fracture. The paper presents a comparison between an artificial neural network design procedure and an approach based on statistical tools; both approaches aim to predict fracture occurrence or absence based on a set of stress and strain path data. The proposed approach is based on the utilization of experimental data available, for a given material, on fracture occurrence in different processes.
In more detail, the approach consists of analysing experimental tests in which fracture occurs, followed by numerical simulations of these processes in order to track the stress-strain paths in the workpiece region where fracture is expected. These data are used to build a data set that serves both to train an artificial neural network and to perform a statistical analysis aimed at predicting fracture occurrence. The developed statistical tool is properly designed and optimized and is able to recognize fracture occurrence. The reliability and predictive capability of the statistical method are compared with those of an artificial neural network developed to predict fracture occurrence. Moreover, the approach is also validated on forming processes characterized by complex fracture mechanics.
Pre and post processing using the IBM 3277 display station graphics attachment (RPQ7H0284)
NASA Technical Reports Server (NTRS)
Burroughs, S. H.; Lawlor, M. B.; Miller, I. M.
1978-01-01
A graphical interactive procedure operating under TSO and utilizing two CRT display terminals is shown to be an effective means of accomplishing mesh generation, establishing boundary conditions, and reviewing graphic output for finite element analysis activity.
USDA-ARS?s Scientific Manuscript database
Initial screening for bacteriophages lytic for Clostridium perfringens was performed utilizing filtered samples obtained from poultry (intestinal material), soil, sewage and poultry processing drainage water. Lytic phage preparations were initially characterized by transmission electron microscopy ...
How Do Land-Use and Climate Change Affect Watershed Health? A Scenario-Based Analysis
With the growing emphasis on biofuel crops and potential impacts of climate variability and change, there is a need to quantify their effects on hydrological processes for developing watershed management plans. Environmental consequences are currently estimated by utilizing comp...
A COMPARATIVE ANALYSIS OF THE RESEARCH UTILIZATION PROCESS.
ERIC Educational Resources Information Center
Lippitt, Ronald; and others
A suggested model for adequate dissemination of research findings considers four primary barriers to effective communication: (1) division of personnel labor into task roles, (2) institutional distinctions, (3) development of professional reference groups, and (4) geographical divisions. Suggested solutions include linking systems and roles,…
Promoting Multicultural Awareness through Electronic Communication
ERIC Educational Resources Information Center
Huang, Hui-Ju
2006-01-01
This project utilized computer technology to establish an email discussion forum for communication and learning in which students shared information, ideas, and processes of learning multicultural education. This paper presents the quantitative count of email messages and qualitative analysis of students' perceptions of email discussions. It then…
Code of Federal Regulations, 2014 CFR
2014-01-01
... serve the load. Eligible borrower means a utility system that has direct or indirect responsibility for... analysis of energy flows in a building, process, or system with the goal of identifying opportunities to... output. HVAC means heating, ventilation, and air conditioning. Load means the Power delivered to power...
Pharmacist. Occupational Simulation Kit.
ERIC Educational Resources Information Center
Parsley, Nancy
This career exploration instructional booklet on the pharmacist's occupation is one of several resulting from the rural southwestern Colorado CEPAC Project (Career Education Process of Attitude Change). Based on a job analysis and utilizing a programed instructional format, the following content is included: A brief description of two real…
Collaborative analysis of wheat endosperm compressive material properties
USDA-ARS?s Scientific Manuscript database
The objective measurement of cereal endosperm texture, for wheat (Triticum L.) in particular, is relevant to the milling, processing and utilization of grain. The objective of this study was to evaluate the inter-laboratory results of compression failure testing of wheat endosperm specimens of defi...
NASA Technical Reports Server (NTRS)
Wilcox, R. E. (Compiler)
1983-01-01
Planned research efforts and reorganization of the Project as the Biocatalysis Research Activity are described, including the following topics: electrocatalysts, fluid extraction, ammonia synthesis, biocatalysis, membrane fouling, energy and economic analysis, decarboxylation, microscopic reaction models, plasmid monitoring, and reaction kinetics.
Atsuta, Yoshiko
2016-01-01
Collection and analysis of information on diseases and post-transplant courses of allogeneic hematopoietic stem cell transplant recipients have played important roles in improving therapeutic outcomes in hematopoietic stem cell transplantation. Efficient, high-quality data collection systems are essential. The introduction of the Second-Generation Transplant Registry Unified Management Program (TRUMP2) is intended to improve data quality and make data management more efficient. The TRUMP2 system will also expand the possible uses of data, as it is capable of building a more complex relational database. The construction of an accessible system for adequate data utilization by researchers would promote greater research activity. Study approval and management processes and authorship guidelines also need to be organized within this context. Quality control of the processes for data manipulation and analysis will also affect study outcomes. Shared scripts that define variables according to standard definitions have been introduced for quality control and to improve the efficiency of registry studies using TRUMP data.
NASA Technical Reports Server (NTRS)
Davidson, J.; Ottey, H. R.; Sawitz, P.; Zusman, F. S.
1985-01-01
The underlying engineering and mathematical models as well as the computational methods used by the Spectrum Orbit Utilization Program 5 (SOUP5) analysis programs are described. Included are the algorithms used to calculate the technical parameters, and references to the technical literature. The organization, capabilities, processing sequences, and processing and data options of the SOUP5 system are described. The details of the geometric calculations are given. Also discussed are the various antenna gain algorithms; rain attenuation and depolarization calculations; calculations of transmitter power and received power flux density; channelization options, interference categories, and protection ratio calculation; generation of aggregate interference and margins; equivalent gain calculations; and how to enter a protection ratio template.
Mining of high utility-probability sequential patterns from uncertain databases
Zhang, Binbin; Fournier-Viger, Philippe; Li, Ting
2017-01-01
High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations such as consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real life, uncertainty is an important factor, as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existence probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments on both real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds. PMID:28742847
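A toy sketch of the pattern-evaluation step underlying HUPSPM, assuming a simplified model in which each sequence carries one existence probability and item utilities are fixed (the real framework uses quantity-weighted utilities, pruning strategies, and database projection, all omitted here; the data below is invented for illustration):

```python
# Toy uncertain sequence database: (item sequence, existence probability).
db = [
    (["a", "b", "c"], 0.9),
    (["a", "c"],      0.6),
    (["b", "c"],      0.4),
]
utility = {"a": 5, "b": 2, "c": 1}  # simplified per-item utilities

def contains(seq, pattern):
    """True if pattern occurs as a (not necessarily contiguous) subsequence."""
    it = iter(seq)
    return all(item in it for item in pattern)

def evaluate(pattern, db, utility):
    """Total utility and support probability of a candidate pattern."""
    total_u = total_p = 0.0
    for seq, prob in db:
        if contains(seq, pattern):
            total_u += sum(utility[i] for i in pattern)
            total_p += prob
    return total_u, total_p

def is_hupsp(pattern, db, utility, min_util, min_prob):
    """A pattern qualifies if it clears both utility and probability thresholds."""
    u, p = evaluate(pattern, db, utility)
    return u >= min_util and p >= min_prob
```

For example, the pattern ["a", "c"] appears in two sequences, giving utility 12 and support probability 1.5 in this toy database.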
A comparison of methods for evaluating structure during ship collisions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ammerman, D.J.; Daidola, J.C.
1996-10-01
A comparison is provided of the results of various methods for evaluating structure during a ship-to-ship collision. The baseline vessel utilized in the analyses is a 67.4 meter in length displacement hull struck by an identical vessel traveling at speeds ranging from 10 to 30 knots. The structural response of the struck vessel and motion of both the struck and striking vessels are assessed by finite element analysis. These same results are then compared to predictions utilizing the "Tanker Structural Analysis for Minor Collisions" (TSAMC) Method, the Minorsky Method, the Haywood Collision Process, and comparison to full-scale tests. Consideration is given to the nature of structural deformation, absorbed energy, penetration, rigid body motion, and virtual mass affecting the hydrodynamic response. Insights are provided with regard to the calibration of the finite element model which was achievable through utilizing the more empirical analyses and the extent to which the finite element analysis is able to simulate the entire collision event. 7 refs., 8 figs., 4 tabs.
Colossal Tooling Design: 3D Simulation for Ergonomic Analysis
NASA Technical Reports Server (NTRS)
Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid
2003-01-01
The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.
NASA Astrophysics Data System (ADS)
Siahaan, P.; Suryani, A.; Kaniawati, I.; Suhendi, E.; Samsudin, A.
2017-02-01
The purpose of this research is to identify the development of students' science process skills (SPS) on the linear motion concept by utilizing a simple computer simulation. To simplify the learning process, the concept is divided into three sub-concepts: 1) the definition of motion, 2) uniform linear motion and 3) uniformly accelerated motion. This research was administered via a pre-experimental method with a one-group pretest-posttest design. The respondents were 23 seventh-grade students in a junior high school in Bandung City. The improvement of students' science process skills is examined using normalized gain analysis of pretest and posttest scores for all sub-concepts. The results show that students' science process skills improved by 47% (moderate) in observation, 43% (moderate) in summarizing, 70% (high) in prediction, 44% (moderate) in communication and 49% (moderate) in classification. These results indicate that utilizing simple computer simulations in physics learning can improve overall science process skills at a moderate level.
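The normalized gain analysis referred to above is presumably Hake's g = (post - pre) / (max - pre); a minimal sketch, assuming percentage scores and the conventional 0.3/0.7 category cut-offs (the study's exact scoring rubric is not given in the abstract):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain g = (post - pre) / (max - pre)."""
    if pre == max_score:
        raise ValueError("pretest at ceiling: gain undefined")
    return (post - pre) / (max_score - pre)

def gain_category(g):
    """Conventional bands: g < 0.3 low, 0.3 <= g < 0.7 moderate, else high."""
    if g < 0.3:
        return "low"
    if g < 0.7:
        return "moderate"
    return "high"
```

For instance, a class moving from 40% to 68% has g ≈ 0.47, a "moderate" gain, matching the magnitude of most gains reported above.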
Peleato, Nicolas M; Legge, Raymond L; Andrews, Robert C
2018-06-01
The use of fluorescence data coupled with neural networks for improved predictability of drinking water disinfection by-products (DBPs) was investigated. Novel application of autoencoders to process high-dimensional fluorescence data was related to common dimensionality reduction techniques of parallel factors analysis (PARAFAC) and principal component analysis (PCA). The proposed method was assessed based on component interpretability as well as for prediction of organic matter reactivity to formation of DBPs. Optimal prediction accuracies on a validation dataset were observed with an autoencoder-neural network approach or by utilizing the full spectrum without pre-processing. Latent representation by an autoencoder appeared to mitigate overfitting when compared to other methods. Although DBP prediction error was minimized by other pre-processing techniques, PARAFAC yielded interpretable components which resemble fluorescence expected from individual organic fluorophores. Through analysis of the network weights, fluorescence regions associated with DBP formation can be identified, representing a potential method to distinguish reactivity between fluorophore groupings. However, distinct results due to the applied dimensionality reduction approaches were observed, dictating a need for considering the role of data pre-processing in the interpretability of the results. In comparison to common organic measures currently used for DBP formation prediction, fluorescence was shown to improve prediction accuracies, with improvements to DBP prediction best realized when appropriate pre-processing and regression techniques were applied. The results of this study show promise for the potential application of neural networks to best utilize fluorescence EEM data for prediction of organic matter reactivity. Copyright © 2018 Elsevier Ltd. All rights reserved.
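Of the three dimensionality-reduction techniques compared above (autoencoder, PARAFAC, PCA), PCA is compact enough to sketch; a minimal SVD-based version operating on flattened fluorescence EEM vectors (variable names are illustrative, not from the paper):

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project rows of X onto the top principal components.

    X: (n_samples, n_features), e.g. flattened EEM spectra.
    Returns (scores, components).
    """
    Xc = X - X.mean(axis=0)                      # center each feature
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    comps = Vt[:n_components]                    # principal axes
    scores = Xc @ comps.T                        # low-dimensional representation
    return scores, comps
```

The low-dimensional scores would then feed a regression model (here, a neural network predicting DBP formation).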
Compact full-motion video hyperspectral cameras: development, image processing, and applications
NASA Astrophysics Data System (ADS)
Kanaev, A. V.
2015-10-01
Emergence of spectral pixel-level color filters has enabled development of hyper-spectral Full Motion Video (FMV) sensors operating in visible (EO) and infrared (IR) wavelengths. The new class of hyper-spectral cameras opens broad possibilities for military and industry use. Indeed, such cameras are able to classify materials as well as detect and track spectral signatures continuously in real time while simultaneously providing an operator the benefit of enhanced-discrimination-color video. Supporting these extensive capabilities requires significant computational processing of the collected spectral data. In general, two processing streams are envisioned for mosaic array cameras. The first is spectral computation that provides essential spectral content analysis, e.g. detection or classification. The second is presentation of the video to an operator, offering the best display of the content depending on the task performed, e.g. spatial resolution enhancement or color coding of the spectral analysis. These processing streams can be executed in parallel, or they can utilize each other's results. The spectral analysis algorithms have been developed extensively; however, demosaicking of more than three equally-sampled spectral bands has been explored scarcely. We present a unique approach to demosaicking based on multi-band super-resolution and show the trade-off between spatial resolution and spectral content. Using imagery collected with the developed 9-band SWIR camera, we demonstrate several of its concepts of operation, including detection and tracking. We also compare the demosaicking results to the results of multi-frame super-resolution as well as to combined multi-frame and multi-band processing.
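The demosaicking step mentioned above can be sketched for the simplest two-band checkerboard mosaic using neighbour averaging; the paper's multi-band super-resolution approach is far more sophisticated, and this only illustrates the baseline interpolation idea:

```python
import numpy as np

def demosaic_two_band(mosaic):
    """Fill each band of a 2-band checkerboard mosaic by neighbour averaging.

    mosaic: (H, W) array where pixel (i, j) holds band (i + j) % 2.
    Returns a (2, H, W) array with the missing samples interpolated.
    """
    H, W = mosaic.shape
    ii, jj = np.indices((H, W))
    out = np.zeros((2, H, W))
    for band in range(2):
        known = (ii + jj) % 2 == band
        vals = np.where(known, mosaic, 0.0)
        cnt = known.astype(float)
        pv, pc = np.pad(vals, 1), np.pad(cnt, 1)
        # sum and count of the 4 in-bounds neighbours belonging to this band
        nb_sum = pv[:-2, 1:-1] + pv[2:, 1:-1] + pv[1:-1, :-2] + pv[1:-1, 2:]
        nb_cnt = pc[:-2, 1:-1] + pc[2:, 1:-1] + pc[1:-1, :-2] + pc[1:-1, 2:]
        interp = nb_sum / np.maximum(nb_cnt, 1)
        out[band] = np.where(known, mosaic, interp)
    return out
```

On a checkerboard every missing pixel's 4-neighbours carry the other band, so each band is recovered from its own samples; real 9-band mosaics need larger neighbourhoods and cross-band priors.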
Hydromagnetic couple-stress nanofluid flow over a moving convective wall: OHAM analysis
NASA Astrophysics Data System (ADS)
Awais, M.; Saleem, S.; Hayat, T.; Irum, S.
2016-12-01
This communication presents the magnetohydrodynamics (MHD) flow of a couple-stress nanofluid over a convective moving wall. The flow dynamics are analyzed in the boundary layer region. The convective cooling phenomenon combined with thermophoresis and Brownian motion effects is discussed. Similarity transforms are utilized to convert the system of partial differential equations into coupled non-linear ordinary differential equations. The optimal homotopy analysis method (OHAM) is utilized, and the concept of minimization is employed by defining the average squared residual errors. Effects of the couple-stress parameter, convective cooling process parameter and energy enhancement parameters are displayed via graphs and discussed in detail. Various tables are also constructed to present the error analysis and a comparison of the obtained results with already published data. Streamlines are plotted showing the difference between the Newtonian fluid model and the couple-stress fluid model.
Shahzad, Khurram; Narodoslawsky, Michael; Sagir, Muhammad; Ali, Nadeem; Ali, Shahid; Rashid, Muhammad Imtiaz; Ismail, Iqbal Mohammad Ibrahim; Koller, Martin
2017-09-01
The utilization of industrial waste streams as input materials for bio-mediated production processes constitutes a current R&D objective, not only to reduce process costs on the input side but, in parallel, to minimize hazardous environmental emissions. In this context, the EU-funded project ANIMPOL elaborated a process for the production of polyhydroxyalkanoate (PHA) biopolymers starting from diverse waste streams of the animal processing industry. This article provides a detailed economic analysis of PHA production from this waste biorefinery concept, encompassing the utilization of low-quality biodiesel, offal material, and meat and bone meal (MBM). Techno-economic analysis reveals that PHA production cost varies from 1.41 €/kg to 1.64 €/kg when considering offal on the one hand as waste, or, on the other hand, accounting for its market price, while calculating with fixed costs for the co-products biodiesel (0.97 €/L) and MBM (350 €/t), respectively. The effect of fluctuating market prices for offal materials, biodiesel, and MBM on the final PHA production cost as well as the investment payback time has been evaluated. Depending on the current market situation, the calculated investment payback time varies from 3.25 to 4.5 years. Copyright © 2017 Elsevier Ltd. All rights reserved.
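The investment payback figures above can be reproduced in spirit by a simple undiscounted payback calculation; the cash-flow numbers below are illustrative assumptions, not the paper's cost data:

```python
def payback_time_years(capex, annual_revenue, annual_opex):
    """Simple undiscounted payback: investment / net annual cash flow."""
    net = annual_revenue - annual_opex
    if net <= 0:
        return float("inf")  # investment is never recovered
    return capex / net

# Illustrative assumed figures only (the paper's cost model is not public):
years = payback_time_years(capex=40e6, annual_revenue=22e6, annual_opex=12e6)
```

With these assumed figures the payback is 4.0 years, inside the 3.25-4.5 year range the study reports; fluctuating offal, biodiesel and MBM prices shift `annual_revenue` and `annual_opex` and hence the result.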
In Silico Analysis of Putrefaction Pathways in Bacteria and Its Implication in Colorectal Cancer
Kaur, Harrisham; Das, Chandrani; Mande, Sharmila S.
2017-01-01
Fermentation of undigested proteins in the human gastrointestinal tract (gut) by the resident microbiota, a process called bacterial putrefaction, can sometimes disrupt gut homeostasis. In this process, essential amino acids (e.g., histidine, tryptophan, etc.) that are required by the host may be utilized by gut microbes. In addition, some of the products of putrefaction, like ammonia, putrescine, cresol, indole, phenol, etc., have been implicated in the disease pathogenesis of colorectal cancer (CRC). We have investigated bacterial putrefaction pathways that are known to be associated with such metabolites. Results of the comprehensive in silico analysis of the selected putrefaction pathways across bacterial genomes revealed the presence of these pathways in a limited set of bacterial groups, the majority of which are commonly found in the human gut. These include Bacillus, Clostridium, Enterobacter, Escherichia, Fusobacterium, Salmonella, etc. Interestingly, while pathogens utilize almost all the analyzed pathways, commensals prefer the putrescine and H2S production pathways for metabolizing undigested proteins. Further, comparison of the putrefaction pathways in the gut microbiomes of healthy, carcinoma and adenoma datasets indicates higher abundances of putrefying bacteria in the carcinoma stage of CRC. The insights obtained from the present study point to possible microbiome-based therapies to minimize the adverse effects of the gut microbiome in enteric diseases. PMID:29163445
NASA Technical Reports Server (NTRS)
Deckman, G.; Rousseau, J. (Editor)
1973-01-01
The Wash Water Recovery System (WWRS) is intended for use in processing shower bath water onboard a spacecraft. The WWRS utilizes flash evaporation, vapor compression, and pyrolytic reaction to process the wash water to allow recovery of potable water. Wash water flashing and foaming characteristics are evaluated, physical properties of concentrated wash water are determined, and a long-term feasibility study of the system is performed. In addition, a computer analysis of the system and a detailed design of a 10 lb/hr vortex-type water vapor compressor were completed. The computer analysis also sized the remaining system components on the basis of the new vortex compressor design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lux, Kenneth; Imam, Thamina; Chevanan, Nehru
This Final Technical Report describes the work and accomplishments of the project entitled “Green-House-Gas-Reduced Coal-and-Biomass-to-Liquid-Based Jet Fuel (GHGR-CBTL) Process”. The main objective of the project was to raise the Technology Readiness Level (TRL) of the GHGR-CBTL fuel-production technology from TRL 4 to TRL 5 by producing a drop-in synthetic Jet Propellant 8 (JP-8) with a greenhouse-gas footprint less than or equal to petroleum-based JP-8, utilizing mixtures of coal and biomass as the feedstock. The system utilizes the patented Altex fuel-production technology, which incorporates advanced catalysts developed by Pennsylvania State University. While the system was not fabricated and tested, major efforts were expended to design the 1-TPD and full-scale plants. The system was designed; a Block-Flow Diagram (BFD), a Process-Flow Diagram (PFD), and Piping-and-Instrumentation Diagrams (P&IDs) were produced; a Bill of Materials (BOM) and associated spec sheets were produced; commercially available components were selected and procured; custom components were designed and fabricated; catalysts were developed and screened for performance; and permitting activities were conducted. Optimization tests for JP-8 production using C2 olefin as the feed were performed over a range of temperatures, pressures and WHSVs. Liquid yields of 63 to 65% with a 65% JP-8 fraction (41-42% JP-8 yield) at 50 psig were achieved. Life-Cycle Analysis (LCA) was performed by Argonne National Laboratory (ANL), and a GHGR-CBTL module was added to the Greenhouse gases, Regulated Emissions, and Energy use in Transportation (GREET®) model. Based upon the experimental results, the plant design was reconfigured for zero natural-gas imports and minimal electricity imports.
The LCA analysis of the reconfigured process utilizing the GREET model showed that if the char from the process were utilized to produce combined heat and power (CHP), then a feed containing 23 wt% biomass and 77 wt% lignite would be needed for parity with petroleum-based JP-8. If the char is not utilized for CHP but sequestered in a landfill, 24 wt% biomass and 76 wt% lignite would be required. A techno-economic analysis (TEA) performed on this configuration following DOE guidelines, using the ANL-developed GREET module, showed that the GHGR-CBTL TOC and ECO are 69% and 58% of those for the DOE FT-Liquids Baseline Case, respectively. This analysis shows that the economics of the GHGR-CBTL process are significantly better than those of a gasification/FT process. No technical barriers were identified. The lower costs and the detailed design performed under this project are being used by Altex to attract funding partners to move the GHGR-CBTL development forward.
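The parity blend reported above follows, in outline, from a linear mixing balance on emission factors; the sketch below uses normalized, purely hypothetical factors (the GREET-based LCA itself accounts for far more than a two-term blend):

```python
def biomass_fraction_for_parity(ef_biomass, ef_lignite, ef_target):
    """Weight fraction x of biomass solving the linear blend balance
    x * ef_biomass + (1 - x) * ef_lignite = ef_target."""
    if ef_lignite == ef_biomass:
        raise ValueError("identical emission factors: ratio has no effect")
    return (ef_lignite - ef_target) / (ef_lignite - ef_biomass)

# With purely hypothetical normalized factors (biomass 0, lignite 1,
# petroleum-parity target 0.77), parity requires 23 wt% biomass,
# mirroring the figure reported for the CHP case.
x = biomass_fraction_for_parity(0.0, 1.0, 0.77)
```

Raising the target (e.g. when char is landfilled rather than used for CHP) pushes the required biomass fraction up, consistent with the 24 wt% figure.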
Image detection and compression for memory efficient system analysis
NASA Astrophysics Data System (ADS)
Bayraktar, Mustafa
2015-02-01
The advances in digital signal processing have been progressing toward efficient use of memory and processing power. Both factors can be exploited by feasible image-storage techniques that compute the minimum information of an image, enhancing computation in later processes. The Scale Invariant Feature Transform (SIFT) can be utilized to estimate and retrieve an image. In computer vision, SIFT can be implemented to recognize an image by comparing its key features against saved SIFT keypoint descriptors. The main advantage of SIFT is that it not only removes redundant information from an image but also reduces the key points by matching their orientation and adding them together in different windows of the image [1]. Another key property of this approach is that it works more efficiently on highly contrasted images, because its design is based on collecting key points from the contrast shades of the image.
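The descriptor-comparison step described above is commonly implemented with a nearest-neighbour search plus Lowe's ratio test; a minimal sketch with Euclidean distances (the descriptor arrays are placeholders, not real SIFT output):

```python
import numpy as np

def match_descriptors(d1, d2, ratio=0.75):
    """Lowe-style ratio-test matching between two descriptor sets.

    d1: (n1, k), d2: (n2, k) with n2 >= 2. Returns (i, j) index pairs
    where the best match in d2 is clearly closer than the second best.
    """
    matches = []
    for i, d in enumerate(d1):
        dist = np.linalg.norm(d2 - d, axis=1)   # distance to every candidate
        order = np.argsort(dist)
        best, second = order[0], order[1]
        if dist[best] < ratio * dist[second]:   # accept only unambiguous matches
            matches.append((i, int(best)))
    return matches
```

The ratio test discards keypoints whose best and second-best candidates are similar, which is what makes SIFT matching robust to the redundant near-duplicate descriptors mentioned above.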
Automation of a N-S S and C Database Generation for the Harrier in Ground Effect
NASA Technical Reports Server (NTRS)
Murman, Scott M.; Chaderjian, Neal M.; Pandya, Shishir; Kwak, Dochan (Technical Monitor)
2001-01-01
A method of automating the generation of a time-dependent, Navier-Stokes static stability and control database for the Harrier aircraft in ground effect is outlined. Reusable, lightweight components are described which allow different facets of the computational fluid dynamic simulation process to utilize a consistent interface to a remote database. These components also allow changes and customizations to be easily incorporated into the solution process to enhance performance, without relying upon third-party support. An analysis of the multi-level parallel solver OVERFLOW-MLP is presented, and the results indicate that it is feasible to utilize large numbers of processors (≈100) even with a grid system with a relatively small number of cells (≈10^6). A more detailed discussion of the simulation process, as well as refined data for the scaling of the OVERFLOW-MLP flow solver, will be included in the full paper.
Atmospheric Modeling And Sensor Simulation (AMASS) study
NASA Technical Reports Server (NTRS)
Parker, K. G.
1984-01-01
The capabilities of the atmospheric modeling and sensor simulation (AMASS) system were studied with the aim of enhancing them. This system is used in processing atmospheric measurements, which are utilized in evaluating sensor performance, conducting design-concept simulation studies, and modeling the physical and dynamical nature of atmospheric processes. The study tasks proposed both to enhance AMASS system utilization and to integrate the AMASS system with other existing equipment, facilitating the analysis of data for modeling and image processing, are enumerated. The following array processors were evaluated for anticipated effectiveness and/or improvements in throughput by attachment of the device to the P-e: (1) Floating Point Systems AP-120B; (2) Floating Point Systems 5000; (3) CSP, Inc. MAP-400; (4) Analogic AP500; (5) Numerix MARS-432; and (6) Star Technologies, Inc. ST-100.
Identification of nodes and internodes of chopped biomass stems by Image analysis
USDA-ARS?s Scientific Manuscript database
Separating the morphological components of biomass leads to better handling, more efficient processing as well as value added product generation, as these components vary in their chemical composition and can be preferentially utilized. Nodes and internodes of biomass stems have distinct chemical co...
Santa Rosa Island Mission Utilization Plan Programmatic Environmental Assessment
2005-03-01
subject areas with the greatest likelihood for potential environmental impacts. In each case, the assessment found that the preferred alternative would...7061, "The Environmental Impact Analysis Process"). Selection of Alternative 3, the preferred alternative, of the Santa Rosa Island Mission...
Deng, Bo; Shi, Yaoyao; Yu, Tao; Kang, Chao; Zhao, Pan
2018-01-31
The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performance of the winding products. In this article, two different objective values of winding products, a mechanical performance (tensile strength) and a physical property (void content), were respectively calculated. Thereafter, the paper presents an integrated methodology that combines multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding process. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensive optimized intervals of the winding parameters. The verification test validated that the optimized intervals of the process parameters were reliable and stable for winding product manufacturing.
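The two-stage screening described above (a global multi-parameter pass followed by a local single-parameter pass) can be sketched roughly as follows. The response function, parameter names, and ranges below are hypothetical stand-ins for illustration only, not the paper's winding model.

```python
def response(params):
    # Hypothetical stand-in for tensile strength as a function of winding
    # temperature, tension, and pressure (NOT the paper's actual model).
    t, f, p = params["temp"], params["tension"], params["pressure"]
    return 100 + 0.5 * t - 0.01 * t ** 2 + 2.0 * f + 0.8 * p

def local_sensitivity(params, name, h=1e-4):
    """Local single-parameter sensitivity: central finite-difference slope."""
    lo, hi = dict(params), dict(params)
    lo[name] -= h
    hi[name] += h
    return (response(hi) - response(lo)) / (2 * h)

def global_relative_sensitivity(params, ranges, steps=20):
    """Crude global measure: output spread as one parameter sweeps its full
    range while the other parameters stay at their nominal values."""
    spread = {}
    for name, (a, b) in ranges.items():
        vals = []
        for i in range(steps + 1):
            p = dict(params)
            p[name] = a + (b - a) * i / steps
            vals.append(response(p))
        spread[name] = max(vals) - min(vals)
    return spread

nominal = {"temp": 40.0, "tension": 30.0, "pressure": 0.4}
ranges = {"temp": (20.0, 80.0), "tension": (10.0, 50.0), "pressure": (0.1, 0.7)}
spread = global_relative_sensitivity(nominal, ranges)   # rank parameters globally
slope_temp = local_sensitivity(nominal, "temp")         # d/dt(0.5t - 0.01t^2) at t=40 is -0.3
```

In this sketch, parameters whose spread is large but whose local slope changes sign across the range (like `temp` here) would be the ones with distinct stability and instability sub-intervals.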
Analysis of edible oil processing options for the BIO-Plex advanced life support system
NASA Technical Reports Server (NTRS)
Greenwalt, C. J.; Hunter, J.
2000-01-01
Edible oil is a critical component of the proposed plant-based Advanced Life Support (ALS) diet. Soybean, peanut, and single-cell oil are the oil source options to date. In terrestrial manufacture, oil is ordinarily extracted with hexane, an organic solvent. However, exposed solvents are not permitted in the spacecraft environment or in enclosed human tests by the National Aeronautics and Space Administration due to their potential danger and handling difficulty. As a result, alternative oil-processing methods will need to be utilized. Preparation and recovery options include traditional dehulling, crushing, conditioning, and flaking; extrusion; pressing; water extraction; and supercritical extraction. These processing options were evaluated on criteria appropriate to the Advanced Life Support System and BIO-Plex application, including product quality, product stability, waste production, risk, energy needs, labor requirements, utilization of nonrenewable resources, usefulness of by-products, and versatility and mass of equipment, to determine the most appropriate ALS edible oil-processing operation.
Future of lignite resources: a life cycle analysis.
Wang, Qingsong; Liu, Wei; Yuan, Xueliang; Zheng, Xiaoning; Zuo, Jian
2016-12-01
Lignite is a low-quality energy source which accounts for 13 % of China's coal reserves. It is imperative to improve the quality of lignite for large-scale utilization. To further explore and analyze the influence of various key processes on the environment and economic costs, a lignite drying and compression technology is evaluated using an integrated approach of life cycle assessment and life cycle costs. Results showed that lignite mining, direct air emissions, and electricity consumption have most significant impacts on the environment. An integrated evaluation of life cycle assessment and life cycle costs showed that the most significant contributor to the environmental impacts and economic costs was the lignite mining process. The impact of transportation and wastewater treatment process on the environment and economic costs was small enough to be ignored. Critical factors were identified for reducing the environmental and economic impacts of lignite drying and compression technology. These findings provide useful inputs for both industrial practice and policy making for exploitation, processing, and utilization of lignite resources.
Mathematical models utilized in the retrieval of displacement information encoded in fringe patterns
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Lamberti, Luciano
2016-02-01
All the techniques that measure displacements, whether in the range of visible optics or any other form of field methods, require the presence of a carrier signal. A carrier signal is a waveform modulated (modified) by an input, the deformation of the medium. A carrier is tagged to the medium under analysis and deforms with the medium. The waveform must be known in both the unmodulated and the modulated conditions. There are two basic mathematical models that can be utilized to decode the information contained in the carrier, phase modulation or frequency modulation; the two are closely connected. Basic problems connected to the detection and recovery of displacement information that are common to all optical techniques are analyzed in this paper, focusing on the general theory common to all the methods independently of the type of signal utilized. The aspects discussed are those that have practical impact on the process of data gathering and data processing.
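The phase-modulation model described above can be illustrated with a minimal quadrature-demodulation sketch: mix the recorded fringe signal with reference carriers at the known carrier frequency, low-pass filter, and take the arctangent of the quadrature pair. The carrier frequency, sampling step, and moving-average filter below are assumptions chosen for illustration, not the authors' algorithm.

```python
import math

def demodulate_phase(signal, f0, dx, win=None):
    """Recover the modulating phase phi(x) of s(x) = cos(2*pi*f0*x + phi(x)):
    mix with cos/sin references at f0, low-pass with a moving average to
    reject the 2*f0 terms, then take atan2 of the quadrature pair."""
    n = len(signal)
    if win is None:
        win = max(1, int(round(1.0 / (f0 * dx))))  # ~one carrier period
    i_ch = [s * math.cos(2 * math.pi * f0 * k * dx) for k, s in enumerate(signal)]
    q_ch = [s * math.sin(2 * math.pi * f0 * k * dx) for k, s in enumerate(signal)]

    def smooth(v, k):
        a, b = max(0, k - win), min(n, k + win + 1)
        return sum(v[a:b]) / (b - a)

    # Low-passed I ~ cos(phi)/2 and Q ~ -sin(phi)/2, so phi = atan2(-Q, I).
    return [math.atan2(-smooth(q_ch, k), smooth(i_ch, k)) for k in range(n)]

# Synthetic fringe pattern: slowly varying phase on a 10 cycles/unit carrier.
dx, f0 = 0.001, 10.0
xs = [k * dx for k in range(2000)]
phi = [0.5 * math.sin(2 * math.pi * 0.5 * x) for x in xs]
sig = [math.cos(2 * math.pi * f0 * x + p) for x, p in zip(xs, phi)]
rec = demodulate_phase(sig, f0, dx)
```

The recovered phase tracks the imposed modulation as long as the modulation varies slowly compared with the carrier, which is the same separation-of-scales assumption underlying both models discussed in the paper.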
The Urban Intensive Land-use Evaluation in Xi’an, Based on Fuzzy Comprehensive Evaluation
NASA Astrophysics Data System (ADS)
Shi, Ru; Kang, Zhiyuan
2018-01-01
Intensive land-use is the basis of urban “stock optimization”, and scientific, reasonable evaluation is an important component of land-intensive utilization. In this paper, based on a survey of land-use conditions in Xi’an, we construct a suitable evaluation index system for intensive land-use in Xi’an using a combination of the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). Through analysis of the factors influencing land-intensive utilization, we provide a reference for the future development direction.
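A combined AHP/FCE scheme of the kind mentioned above can be sketched minimally as follows, assuming the common geometric-mean approximation for AHP priority weights and a simple weighted-average fuzzy operator. The three-criterion pairwise matrix and membership values are invented for illustration, not the Xi'an index system.

```python
from math import prod

def ahp_weights(matrix):
    """Geometric-mean approximation to the principal-eigenvector weights
    of an AHP pairwise comparison matrix (Saaty 1-9 scale assumed)."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]
    s = sum(gm)
    return [g / s for g in gm]

def fuzzy_composite(weights, memberships):
    """Fuzzy comprehensive evaluation with the weighted-average operator:
    composite grade vector = w . R (criteria x grades membership matrix)."""
    grades = len(memberships[0])
    return [sum(w * row[j] for w, row in zip(weights, memberships))
            for j in range(grades)]

# Hypothetical 3-criterion example (invented numbers):
pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
w = ahp_weights(pairwise)
# Membership of each criterion in grades [intensive, moderate, extensive]:
R = [[0.6, 0.3, 0.1],
     [0.2, 0.5, 0.3],
     [0.1, 0.4, 0.5]]
score = fuzzy_composite(w, R)  # grade vector; argmax gives the rating
```

The grade with the largest composite membership would be taken as the evaluation result under the maximum-membership principle.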
Brown, H J; Miller, J K; Pinchoff, D M
1975-01-01
The Information Dissemination Service at the Health Sciences Library, State University of New York at Buffalo, was established June 1970 through a three-year grant from the Lakes Area Regional Medical Program, Inc. Analysis of two samples of user request forms yielded results which significantly substantiate findings in prior biomedical literature utilization studies. The findings demonstrate comparable utilization patterns by user group, age of material, journal titles, language, time to process request, source of reference, and size of institution. PMID:1148441
Schrem, Harald; Schneider, Valentin; Kurok, Marlene; Goldis, Alon; Dreier, Maren; Kaltenborn, Alexander; Gwinner, Wilfried; Barthold, Marc; Liebeneiner, Jan; Winny, Markus; Klempnauer, Jürgen; Kleine, Moritz
2016-01-01
The aim of this study is to identify independent pre-transplant cancer risk factors after kidney transplantation and to assess the utility of G-chart analysis for clinical process control. This may contribute to the improvement of cancer surveillance processes in individual transplant centers. 1655 patients after kidney transplantation at our institution with a total of 9,425 person-years of follow-up were compared retrospectively to the general German population using site-specific standardized-incidence-ratios (SIRs) of observed malignancies. Risk-adjusted multivariable Cox regression was used to identify independent pre-transplant cancer risk factors. G-chart analysis was applied to determine relevant differences in the frequency of cancer occurrences. Cancer incidence rates were almost three times higher as compared to the matched general population (SIR = 2.75; 95%-CI: 2.33-3.21). Significantly increased SIRs were observed for renal cell carcinoma (SIR = 22.46), post-transplant lymphoproliferative disorder (SIR = 8.36), prostate cancer (SIR = 2.22), bladder cancer (SIR = 3.24), thyroid cancer (SIR = 10.13) and melanoma (SIR = 3.08). Independent pre-transplant risk factors for cancer-free survival were age <52.3 years (p = 0.007, Hazard ratio (HR): 0.82), age >62.6 years (p = 0.001, HR: 1.29), polycystic kidney disease other than autosomal dominant polycystic kidney disease (ADPKD) (p = 0.001, HR: 0.68), high body mass index in kg/m2 (p<0.001, HR: 1.04), ADPKD (p = 0.008, HR: 1.26) and diabetic nephropathy (p = 0.004, HR = 1.51). G-chart analysis identified relevant changes in the detection rates of cancer during aftercare with no significant relation to identified risk factors for cancer-free survival (p<0.05). 
Risk-adapted cancer surveillance combined with prospective G-chart analysis likely improves cancer surveillance schemes by adapting processes to identified risk factors and by using G-chart alarm signals to trigger Kaizen events and audits for root-cause analysis of relevant detection rate changes. Further, comparative G-chart analysis would enable benchmarking of cancer surveillance processes between centers.
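The site-specific SIRs reported above are ratios of observed to expected case counts with Poisson confidence limits. A minimal sketch follows, using Byar's approximation for the limits and hypothetical counts; the paper's exact interval method and underlying counts are not given here.

```python
import math

def sir_with_ci(observed, expected, z=1.96):
    """Standardized incidence ratio O/E with approximate 95% confidence
    limits via Byar's approximation to the exact Poisson interval."""
    o = observed
    lower = o * (1 - 1 / (9 * o) - z / (3 * math.sqrt(o))) ** 3 / expected
    upper = (o + 1) * (1 - 1 / (9 * (o + 1)) + z / (3 * math.sqrt(o + 1))) ** 3 / expected
    return o / expected, lower, upper

# Hypothetical: 30 observed cancers against 10.9 expected from
# population rates applied to the cohort's person-years.
sir, lo_lim, hi_lim = sir_with_ci(30, 10.9)
```

An SIR whose lower limit exceeds 1.0, as in this hypothetical example, indicates a significantly elevated incidence relative to the matched general population.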
2012-09-30
recognition. Algorithm design and statistical analysis and feature analysis. Post-Doctoral Associate, Cornell University, Bioacoustics Research...short. The HPC-ADA was designed based on fielded systems [1-4, 6] that offer a variety of desirable attributes, specifically dynamic resource...The software package was designed to utilize parallel and distributed processing for running recognition and other advanced algorithms. DeLMA
The Tissue Analysis Core (TAC) within the AIDS and Cancer Virus Program will process, embed, and perform microtomy on fixed tissue samples presented in ethanol. CD4 (DAB) and CD68/CD163 (FastRed) double immunohistochemistry will be performed, in whic
Study of the urban evolution of Brasilia with the use of LANDSAT data
NASA Technical Reports Server (NTRS)
Deoliveira, M. D. N. (Principal Investigator); Foresti, C.; Niero, M.; Parreiras, E. M. D. F.
1984-01-01
The urban growth of Brasilia within the last ten years is analyzed with special emphasis on the utilization of remote sensing orbital data and automatic image processing. The urban spatial structure and the monitoring of its temporal changes were examined in a holistic and dynamic way through the utilization of MSS-LANDSAT images from June 1973, 1978 and 1983. To aid data interpretation, a registration algorithm implemented on the Interactive Multispectral Image Analysis System (IMAGE-100) was utilized to overlap the multitemporal images. The utilization of suitable digital filters, combined with the image overlap, allowed rapid identification of areas of possible urban growth and oriented the field work. The results obtained permitted an evaluation of the urban growth of Brasilia, taking as reference the plan proposed for the construction of the city.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Higashide, Wendy; Rohlin, Lars
Easel Biotechnologies, LLC’s Bio-Oxo process has demonstrated that isobutyraldehyde can be biologically produced from corn stover hydrolysate up to 56 g/L in a 14L fermentor. This was accomplished by metabolically engineering bacterial strains to not only produce isobutyraldehyde, but to do so by co-utilizing corn stover hydrolysate sugars, glucose and xylose. Also essential to the success of the Bio-Oxo process was that it utilized gas stripping as a means of product separation, allowing for the continuous removal of isobutyraldehyde. This aided in not only reducing energy costs associated with separation, but also alleviating product toxicity, resulting in higher production. Although we were not able to complete our economic analysis based on pilot scale fermentations, the improvements we have made from strain engineering to product separation should result in the reduced cost of isobutyraldehyde. Still, as the project has ended prematurely, there is room for additional optimization. Improvements in productivity and sugar utilization would result in a further reduction in capital and recovery costs. As a biological-based process, the utilization of corn stover results in reduced greenhouse gas emissions as compared to petroleum-based chemical synthesis. In addition, as a true replacement chemical “drop in” system, no downstream production units need to be changed. Jobs can also be created as farm waste needs to be collected and transported to the new production facility.
Retrieval of radiology reports citing critical findings with disease-specific customization.
Lacson, Ronilda; Sugarbaker, Nathanael; Prevedello, Luciano M; Ivan, Ip; Mar, Wendy; Andriole, Katherine P; Khorasani, Ramin
2012-01-01
Communication of critical results from diagnostic procedures between caregivers is a Joint Commission national patient safety goal. Evaluating critical result communication often requires manual analysis of voluminous data, especially when reviewing unstructured textual results of radiologic findings. Information retrieval (IR) tools can facilitate this process by enabling automated retrieval of radiology reports that cite critical imaging findings. However, IR tools that have been developed for one disease or imaging modality often need substantial reconfiguration before they can be utilized for another disease entity. This paper 1) describes the process of customizing two Natural Language Processing (NLP) and Information Retrieval/Extraction applications - an open-source toolkit, A Nearly New Information Extraction system (ANNIE), and an application developed in-house, Information for Searching Content with an Ontology-Utilizing Toolkit (iSCOUT) - to illustrate the varying levels of customization required for different disease entities; and 2) evaluates each application's performance in identifying and retrieving radiology reports citing critical imaging findings for three distinct diseases: pulmonary nodule, pneumothorax, and pulmonary embolus. Both applications can be utilized for retrieval. iSCOUT and ANNIE had precision values between 0.90 and 0.98 and recall values between 0.79 and 0.94. ANNIE had consistently higher precision but required more customization. Understanding the customizations involved in utilizing NLP applications for various diseases will enable users to select the most suitable tool for specific tasks.
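Precision and recall of the kind reported above are computed from the sets of retrieved and truly relevant reports; a minimal sketch follows, with hypothetical report IDs rather than the study's data.

```python
def precision_recall(retrieved, relevant):
    """Precision = fraction of retrieved reports that are relevant;
    recall = fraction of relevant reports that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    tp = len(retrieved & relevant)  # true positives
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical report IDs for one disease entity:
got = {"r1", "r2", "r3", "r4", "r5"}      # reports the tool returned
truth = {"r1", "r2", "r3", "r6"}          # reports a radiologist flagged
p, r = precision_recall(got, truth)       # p = 0.6, r = 0.75
```

A tool tuned for higher precision (like ANNIE in the study) returns fewer false positives at the cost of missing some relevant reports, which is the usual trade-off when customizing retrieval rules per disease.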
Contact dynamics recording and analysis system using an optical fiber sensor approach
NASA Astrophysics Data System (ADS)
Anghel, F.; Pavelescu, D.; Grattan, K. T. V.; Palmer, A. W.
1997-09-01
A contact dynamics recording and analysis system based on an optical fiber sensor has been developed, designed in particular for the accurate, time-varying description of moving contacts during electrical arc breaking in an experimental platform simulating the operation of a vacuum circuit breaker. The system utilizes dynamic displacement measurement with data recording, and post-process data analysis reveals the dynamic speed and acceleration of the equipment.
Automated Drug Identification for Urban Hospitals
NASA Technical Reports Server (NTRS)
Shirley, Donna L.
1971-01-01
Many urban hospitals are becoming overloaded with drug abuse cases requiring chemical analysis for identification of drugs. In this paper, the requirements for chemical analysis of body fluids for drugs are determined and a system model for automated drug analysis is selected. The system, as modeled, would perform chemical preparation of samples, gas-liquid chromatographic separation of drugs in the chemically prepared samples, and infrared spectrophotometric analysis of the drugs, and would utilize automatic data processing and control for drug identification. Requirements of cost, maintainability, reliability, flexibility, and operability are considered.
Analysis of a document/reporting system
NASA Technical Reports Server (NTRS)
Narrow, B.
1971-01-01
An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. It is believed that this is the first documented study which utilizes quantitative measures for full-scale system analysis. The quantitative measures and techniques for collecting and quantifying the basic data, as described, are applicable to any information system. Therefore this report is considered to be of interest to any persons concerned with the management, design, analysis or evaluation of information systems.
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of a new VLBI analysis software package under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular, to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
Fast Edge Detection and Segmentation of Terrestrial Laser Scans Through Normal Variation Analysis
NASA Astrophysics Data System (ADS)
Che, E.; Olsen, M. J.
2017-09-01
Terrestrial Laser Scanning (TLS) utilizes light detection and ranging (lidar) to effectively and efficiently acquire point cloud data for a wide variety of applications. Segmentation is a common post-processing procedure that groups the point cloud into a number of clusters to simplify the data for the subsequent modelling and analysis needed by most applications. This paper presents a novel method to rapidly segment TLS data based on edge detection and region growing. First, by computing the projected incidence angles and performing the normal variation analysis, the silhouette edges and intersection edges are separated from the smooth surfaces. Then a modified region growing algorithm groups the points lying on the same smooth surface. The proposed method efficiently exploits the gridded scan pattern utilized during acquisition of TLS data from most sensors and takes advantage of parallel programming to process approximately 1 million points per second. Moreover, the proposed segmentation does not require estimating the normal at each point, which prevents errors in normal estimation from propagating into the segmentation. Both an indoor and an outdoor scene are used in an experiment to demonstrate and discuss the effectiveness and robustness of the proposed segmentation method.
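The edge-then-grow idea can be illustrated with a toy version on a gridded scan. Note the hedge: the paper's method deliberately avoids per-point normal estimation, whereas this sketch, purely for illustration of the grouping step, assumes unit normals are already available per grid cell and treats neighbors whose normals differ by more than a fixed angle as edges.

```python
import math
from collections import deque

def angle_deg(n1, n2):
    """Angle in degrees between two unit normals."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(n1, n2))))
    return math.degrees(math.acos(dot))

def segment_grid(normals, thresh_deg=20.0):
    """Breadth-first region growing over a grid of unit normals:
    4-neighbors join the same region only if their normal variation
    stays below thresh_deg, so sharp folds act as region boundaries."""
    rows, cols = len(normals), len(normals[0])
    label = [[-1] * cols for _ in range(rows)]
    next_label = 0
    for r in range(rows):
        for c in range(cols):
            if label[r][c] != -1:
                continue
            label[r][c] = next_label
            q = deque([(r, c)])
            while q:
                cr, cc = q.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = cr + dr, cc + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and label[nr][nc] == -1
                            and angle_deg(normals[cr][cc], normals[nr][nc]) < thresh_deg):
                        label[nr][nc] = next_label
                        q.append((nr, nc))
            next_label += 1
    return label, next_label

# Two planar patches meeting at a 90-degree fold:
# left cells face +z, right cells face +x.
up, side = (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)
grid = [[up, up, side, side] for _ in range(3)]
labels, n_regions = segment_grid(grid)  # expect one region per planar face
```

On real TLS data the same traversal runs over the sensor's row/column scan grid, which is what lets the full method avoid a costly neighborhood search per point.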
Perceived Insider Status and Feedback Reactions: A Dual Path of Feedback Motivation Attribution.
Chen, Xiao; Liao, JianQiao; Wu, Weijiong; Zhang, Wei
2017-01-01
Many studies have evaluated how the characteristics of the feedback receiver, the feedback deliverer and the feedback information influence the psychological feedback reactions of the feedback receiver, while largely neglecting that feedback intervention is a kind of social interaction process. To address this issue, this study proposes that employees' perceived insider status (PIS), as a kind of employee-organization relationship, could also influence employees' reactions to supervisory feedback. In particular, this study investigates the influence of PIS on affective and cognitive feedback reactions, namely feedback satisfaction and feedback utility. Surveys were conducted in a machinery manufacturing company in the Guangdong province of China, and samples were collected from 192 employees. Data analysis demonstrated that PIS and feedback utility had a U-shaped relationship, whereas PIS and feedback satisfaction had a positive linear relationship. The analysis identified two kinds of mediating mechanisms related to feedback satisfaction and feedback utility. Internal feedback motivation attribution partially mediated the relationship between PIS and feedback satisfaction but failed to do the same with respect to the relationship between PIS and feedback utility. In contrast, external feedback motivation attribution partially mediated the relationship between PIS and feedback utility while failing to mediate the relationship between PIS and feedback satisfaction. Theoretical contributions and practical implications of the findings are discussed at the end of the paper.
Utility of action checklists as a consensus building tool
KIM, Yeon-Ha; YOSHIKAWA, Etsuko; YOSHIKAWA, Toru; KOGI, Kazutaka; JUNG, Moon-Hee
2014-01-01
The present study’s objective was to determine the mechanisms for enhancing the utility of action checklists applied in participatory approach programs for workplace improvements, to identify the benefits of building consensus, and to compare their applicability across Asian countries in order to find the most appropriate configuration for action checklists. Data were collected from eight trainees and 43 trainers with experience in Participatory Action-Oriented Training. Statistical analysis was performed in SPSS using the package PASW, version 19.0. The difference in the mean score for the degree of the utility of action checklists between countries was analyzed using ANOVA methods. Factor analysis was performed to validate the action checklists’ utility. Pearson Correlation Coefficients were then calculated to determine the direction and strength of the relationship between these factors. Using responses obtained from trainees’ in-depth interviews, we identified 33 key statements that were then classified into 11 thematic clusters. Five factors were extracted, namely “ease of application”, “practical solutions”, “group interaction”, “multifaceted perspective” and “active involvement”. The action checklist was useful for facilitating a participatory process among trainees and trainers for improving working conditions. Action checklists showed similar patterns of utility in various Asian countries, particularly when adjusted to local conditions. PMID:25224334
Internal Labor Markets: An Empirical Investigation.
ERIC Educational Resources Information Center
Mahoney, Thomas A.; Milkovich, George T.
Methods of internal labor market analysis for three organizational areas are presented, along with some evidence about the validity and utility of conceptual descriptions of such markets. The general concept of an internal labor market refers to the process of pricing and allocation of manpower resources within an employing organization and rests…
An Instructional Approach to Modeling in Microevolution.
ERIC Educational Resources Information Center
Thompson, Steven R.
1988-01-01
Describes an approach to teaching population genetics and evolution and some of the ways models can be used to enhance understanding of the processes being studied. Discusses the instructional plan, and the use of models including utility programs and analysis with models. Provided are a basic program and sample program outputs. (CW)
The Rhetoric of Study Abroad: Perpetuating Expectations and Results through Technological Enframing
ERIC Educational Resources Information Center
Bishop, Sarah C.
2013-01-01
This analysis examines the preparatory and reflective online rhetoric available to potential and past academic travelers at the university level. Utilizing Martin Heidegger's (1977) notion of the ways in which technological processes "enframe" human experiences, the article scrutinizes the visual and verbal rhetoric found on the websites…
ANALYSIS AND EVALUATION OF MYCELIUM REINFORCED NATURAL FIBER BIO-COMPOSITES
USDA-ARS?s Scientific Manuscript database
There is a need for biodegradable alternatives to the inert plastics and expanded foams that are common in both the manufacturing process and device componentry. The material in this study is a bio-composite patented by Ecovative Design LLC. The manufacturer's bio-composite utilizes fungal mycelium ...
Code of Federal Regulations, 2014 CFR
2014-01-01
... emergency power to instruments, utility service systems, and operating systems important to safety if there... include: (a) A general description of the structures, systems, components, equipment, and process... of the performance of the structures, systems, and components to identify those that are important to...
New Pathways for Teaching Chemistry: Reflective Judgment in Science.
ERIC Educational Resources Information Center
Finster, David C.
1992-01-01
The reflective judgment model offers a rich context for analysis of science and science teaching. It provides deeper understanding of the scientific process and the critical thinking it involves, and reveals fundamental connections between science and the other liberal arts. Classroom techniques from a college chemistry course illustrate the utility of the…
Learning through Accreditation: Faculty Reflections on the Experience of Program Evaluation
ERIC Educational Resources Information Center
Garrison, Sarah; Herring, Angel; Hinton, W. Jeff
2013-01-01
This qualitative study was conducted to explore the personal and professional experiences of family and consumer sciences educators (n = 3) who recently participated in the AAFCS accreditation process utilizing the 2010 Accreditation standards. Analysis of the transcribed semi-structured interview data yielded four overarching categories: (a)…
Travel Agent. Occupational Simulation Kit.
ERIC Educational Resources Information Center
Peterson, Wayne
This career exploration instructional booklet on the travel agent's occupation is one of several resulting from the rural southwestern Colorado CEPAC Project (Career Education Process of Attitude Change). Based on a job analysis and utilizing a programmed instructional format, the following content is included: A brief description of what a travel…
Tailoring the Interview Process for More Effective Personnel Selection.
ERIC Educational Resources Information Center
Saville, Anthony
Structuring the initial teacher employment interview adds validity to selection and appropriately utilizes human resources. Five aspects of an effective interview program include: (1) developing a job analysis plan; (2) reviewing the applications; (3) planning for the interview; (4) the interview instrument; and (5) legal implications. An…
Educational Computer Utilization and Computer Communications.
ERIC Educational Resources Information Center
Singh, Jai P.; Morgan, Robert P.
As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch processing were explored…
Social Worker. Occupational Simulation Kit.
ERIC Educational Resources Information Center
Brandt, Joy
This career exploration instructional booklet on the occupation of the social worker is one of several resulting from the rural southwestern Colorado CEPAC Project (Career Education Process of Attitude Change). Based on a job analysis and utilizing a programmed instructional format, the following content is included: A brief description of what a…
An Analysis of the Effect of Mobile Learning on Lebanese Higher Education
ERIC Educational Resources Information Center
Jabbour, Khayrazad Kari
2014-01-01
This research explores the effect of mobile technology in Lebanese higher education classrooms. Three components were utilized to evaluate the impact: student attitudes, student achievements, and educational process. This study used both quantitative and qualitative methods to examine the research questions. The main sources for data collection…
Research in remote sensing of agriculture, earth resources, and man's environment
NASA Technical Reports Server (NTRS)
Landgrebe, D. A.
1975-01-01
Progress is reported for several projects involving the utilization of LANDSAT remote sensing capabilities. Areas under study include crop inventory, crop identification, crop yield prediction, forest resources evaluation, land resources evaluation and soil classification. Numerical methods for image processing are discussed, particularly those for image enhancement and analysis.
Sadowski, Franklin G.; Covington, Steven J.
1987-01-01
Advanced digital processing techniques were applied to Landsat-5 Thematic Mapper (TM) data and SPOT high-resolution visible (HRV) panchromatic data to maximize the utility of images of a nuclear powerplant emergency at Chernobyl in the Soviet Ukraine. The images demonstrate the unique interpretive capabilities provided by the numerous spectral bands of the Thematic Mapper and the high spatial resolution of the SPOT HRV sensor.
Intuition and nursing practice implications for nurse educators: a review of the literature.
Correnti, D
1992-01-01
Intuitive knowledge is an essential component of the art of nursing and of the nursing process. This article provides an analysis and review of the literature on intuition. The author addresses the use of intuition in nursing science, characteristics of intuitive nurses, receptivity of intuitive knowledge, and the importance of expanding nursing's utilization of the intuitive process. Strategies are provided for promoting intuitive skills in continuing education/staff development settings.
Automation of the aircraft design process
NASA Technical Reports Server (NTRS)
Heldenfels, R. R.
1974-01-01
The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.
Transcriptional Regulatory Network Analysis of MYB Transcription Factor Family Genes in Rice.
Smita, Shuchi; Katiyar, Amit; Chinnusamy, Viswanathan; Pandey, Dev M; Bansal, Kailash C
2015-01-01
MYB transcription factor (TF) is one of the largest TF families and regulates defense responses to various stresses, hormone signaling, as well as many metabolic and developmental processes in plants. Understanding these regulatory hierarchies of gene expression networks in response to developmental and environmental cues is a major challenge due to the complex interactions between the genetic elements. Correlation analyses are useful for unraveling co-regulated gene pairs governing biological processes, as well as for identifying new candidate hub genes involved in these complex processes. High-throughput expression profiling data are highly useful for construction of co-expression networks. In the present study, we utilized transcriptome data for comprehensive regulatory network studies of MYB TFs by "top-down" and "guide-gene" approaches. More than 50% of OsMYBs were strongly correlated under 50 experimental conditions with 51 hub genes via the "top-down" approach. Further, clusters were identified using Markov Clustering (MCL). To maximize the clustering performance, parameter evaluation of the MCL inflation score (I) was performed in terms of enriched GO categories by measuring the F-score. Comparison of the co-expressed clusters with clades from the phylogenetic analysis signifies their evolutionarily conserved co-regulatory roles. We utilized a compendium of known interactions and biological roles, together with Gene Ontology enrichment analysis, to hypothesize functions of co-expressed OsMYBs. In the second part, transcriptional regulatory network analysis by the "guide-gene" approach revealed 40 putative targets of 26 OsMYB TF hubs with high correlation values, utilizing 815 microarray datasets. Enrichment of MYB-binding cis-elements in the promoter regions of the putative targets, functional co-occurrence, and nuclear localization support our findings.
Specifically, the enrichment of MYB-binding regions involved in drought inducibility implies a regulatory role in the drought response in rice. Thus, the co-regulatory network analysis facilitated the identification of complex OsMYB regulatory networks and candidate target regulon genes of selected guide MYB genes. The results contribute to candidate gene screening and to experimentally testable hypotheses for potential regulatory MYB TFs and their targets under stress conditions.
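The MCL step named in the abstract alternates expansion (matrix squaring, which spreads random-walk flow) with inflation (element-wise powering plus column re-normalization, which sharpens strong flows); the inflation score I controls cluster granularity, and the study tunes it by F-score over enriched GO categories. A minimal pure-Python sketch on a toy six-gene graph (hypothetical adjacency, fixed I):

```python
# Toy Markov Clustering (MCL): two disconnected 3-gene "co-expression" paths
# should come out as two clusters. Hypothetical graph, not the study's data.
def col_normalize(M):
    """Normalize each column of M to sum to 1 (column-stochastic matrix)."""
    n = len(M)
    for j in range(n):
        s = sum(M[i][j] for i in range(n))
        for i in range(n):
            M[i][j] /= s
    return M

def expand(M):
    """Matrix squaring: flow spreads along longer walks in the graph."""
    n = len(M)
    return [[sum(M[i][k] * M[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def inflate(M, I):
    """Element-wise power I, then re-normalize: strengthens strong flows."""
    return col_normalize([[v ** I for v in row] for row in M])

def mcl(adj, I=2.0, iters=30):
    M = col_normalize([row[:] for row in adj])
    for _ in range(iters):
        M = inflate(expand(M), I)
    clusters = []
    for row in M:                      # attractor rows spell out the clusters
        members = sorted(j for j, v in enumerate(row) if v > 0.5)
        if members and members not in clusters:
            clusters.append(members)
    return clusters

# Two disconnected 3-node paths, self-loops included (as MCL requires):
adj = [[1, 1, 0, 0, 0, 0],
       [1, 1, 1, 0, 0, 0],
       [0, 1, 1, 0, 0, 0],
       [0, 0, 0, 1, 1, 0],
       [0, 0, 0, 1, 1, 1],
       [0, 0, 0, 0, 1, 1]]
clusters = mcl([[float(v) for v in row] for row in adj])
```

Raising I produces smaller, tighter clusters; lowering it merges them — which is why the study sweeps I and scores each clustering against GO enrichment.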
Impact of HMO ownership on management processes and utilization outcomes.
Ahern, M; Molinari, C
2001-05-01
To examine the effects of health maintenance organization (HMO) ownership characteristics on selected utilization outcomes and management processes affecting utilization. We used 1995 HMO data from the American Association of Health Plans. Using regression analysis, we examined the relation between HMO utilization (hospital discharges, days, and average length of stay; cardiac catheterization procedures; and average cost of outpatient prescriptions) and the structural characteristics of HMOs: ownership type (insurance company, hospital, physician, independent, and national managed care company), HMO size, for-profit status, model type, geographic region, and payer mix. HMO ownership type is significantly associated with medical management processes, including risk sharing by providers, risk sharing by consumers, and other management strategies. Relative to hospital-owned HMOs, insurance company-owned HMOs have fewer hospital discharges, fewer hospital days, and longer lengths of stay. National managed care organization-owned HMOs have fewer cardiac catheterizations and lower average outpatient prescription costs. Independently owned HMOs have more cardiac catheterizations. For-profit HMOs have lower prescription costs. Relative to hospital-owned HMOs, insurance company-owned HMOs are more likely to use hospital risk sharing and provider capitation and less likely to use out-of-pocket payments for hospital use and a closed formulary. National managed care organization-owned HMOs are less likely to use provider capitation, out-of-pocket payments for hospital use, catastrophic case management, and hospital risk sharing. Physician-hospital-owned HMOs are less likely to use catastrophic case management. For-profit HMOs are more likely to use hospital risk sharing and catastrophic case management. HMO ownership type affects utilization outcomes and management strategies.
Commissioning of a CERN Production and Analysis Facility Based on xrootd
NASA Astrophysics Data System (ADS)
Campana, Simone; van der Ster, Daniel C.; Di Girolamo, Alessandro; Peters, Andreas J.; Duellmann, Dirk; Coelho Dos Santos, Miguel; Iven, Jan; Bell, Tim
2011-12-01
The CERN facility hosts the Tier-0 of the four LHC experiments, but as part of WLCG it also offers a platform for production activities and user analysis. The CERN CASTOR storage technology has been extensively tested and utilized for LHC data recording and exporting to external sites according to the experiments' computing models. On the other hand, to accommodate Grid data processing activities and, more importantly, chaotic user analysis, it was realized that additional functionality was needed, including a different throttling mechanism for file access. This paper will describe the xrootd-based CERN production and analysis facility for the ATLAS experiment and in particular the experiment use case and data access scenario, the xrootd redirector setup on top of the CASTOR storage system, the commissioning of the system, and real-life experience with data processing and data analysis.
A retrospective analysis of the change in anti-malarial treatment policy: Peru.
Williams, Holly Ann; Vincent-Mark, Arlene; Herrera, Yenni; Chang, O Jaime
2009-04-28
National malaria control programmes must deal with the complex process of changing national malaria treatment guidelines, often without guidance on the process of change. Selecting a replacement drug is only one issue in this process. There is a paucity of literature describing successful malaria treatment policy changes to help guide control programmes through this process. The aim was to understand the wider context in which national malaria treatment guidelines were formulated in a specific country (Peru). Using qualitative methods (individual and focus group interviews, stakeholder analysis and a review of documents), a retrospective analysis of the process of change in Peru's anti-malarial treatment policy from the early 1990s to 2003 was completed. The decision to change Peru's policies resulted from increasing levels of anti-malarial drug resistance, as well as complaints from providers that the drugs were no longer working. The change occurred at a time when Peru was changing national governments, which created extreme challenges in moving the change process forward. Peru successfully utilized a number of key strategies to ensure that policy change would occur. These included a) having the process directed by a group who shared a common interest in malaria and who had long-established social and professional networks among themselves, b) engaging in collaborative teamwork among nationals and between nationals and international collaborators, c) respect for and inclusion of district-level staff in all phases of the process, d) reliance on high levels of technical and scientific knowledge, e) use of standardized protocols to collect data, and f) transparency. Although not perfectly or fully implemented by 2003, the change in malaria treatment policy in Peru occurred very quickly compared to other countries.
They identified a problem, collected the data necessary to justify the change, utilized political will to their favor, approved the policy, and moved to improve malaria control in their country. As such, they offer an excellent example for other countries as they contemplate or embark on policy changes.
Risk Assessment on Constructors during Over-water Riprap Based on Entropy Weight and FAHP
NASA Astrophysics Data System (ADS)
Wu, Tongqing; Li, Liang; Liang, Zelong; Mao, Tian; Shao, Weifeng
2017-07-01
In waterway regulation engineering, over-water riprap poses risks to constructors that are marked by uncertainty and complexity. To evaluate their probability and consequences, this paper utilizes the fuzzy analytic hierarchy process (FAHP) to weight the related risk indicators, constructs an entropy-weighted FAHP model, and establishes an evaluation factor set and evaluation language for constructors during the over-water riprap construction process. By estimating risk probability and evaluating the severity of risk consequences for the constructor factor, the paper applies this model to a risk analysis of constructors during over-water riprap in the Ching River waterway regulation project. Results show that the method's evaluation is credible enough to be utilized in practical engineering.
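The entropy-weight step used above to weight the risk indicators can be sketched as follows. The decision matrix is hypothetical; the idea is that a criterion whose scores barely vary across alternatives carries little discriminating information and so receives low weight:

```python
# Entropy-weight method sketch (hypothetical risk-indicator scores).
# X is m alternatives x n criteria, all scores positive.
from math import log

def entropy_weights(X):
    m, n = len(X), len(X[0])
    divergence = []
    for j in range(n):
        col = [X[i][j] for i in range(m)]
        total = sum(col)
        p = [x / total for x in col]                 # column-normalized shares
        e = -sum(q * log(q) for q in p if q > 0) / log(m)   # entropy in [0, 1]
        divergence.append(1.0 - e)                   # degree of divergence
    s = sum(divergence)
    return [d / s for d in divergence]               # normalized weights

# Three construction scenarios scored on three risk indicators;
# indicator 0 is identical everywhere, so it should get (near-)zero weight.
X = [[0.6, 5.0, 3.0],
     [0.6, 3.0, 2.0],
     [0.6, 1.0, 4.0]]
w = entropy_weights(X)
```

In an entropy-weighted FAHP, these objective weights are typically combined with the subjective pairwise-comparison weights from the fuzzy AHP step.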
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garling, W.S.; Harper, M.R.; Merchant-Geuder, L.
1980-03-01
Potential applications of wind energy include not only large central turbines that can be utilized by utilities, but also dispersed systems for farms and other applications. The US Departments of Energy (DOE) and Agriculture (USDA) currently are establishing the feasibility of wind energy use in applications where the energy can be used as available, or stored in a simple form. These applications include production of hot water for rural sanitation, heating and cooling of rural structures and products, drying agricultural products, and irrigation. This study, funded by USDA, analyzed the economic feasibility of wind power in refrigeration cooling and water heating systems in food processing plants. Types of plants included were meat and poultry, dairy, fruit and vegetable, and aquaculture.
Value of the distant future: Model-independent results
NASA Astrophysics Data System (ADS)
Katz, Yuri A.
2017-01-01
This paper shows that a model-independent account of correlations in an interest rate process or a log-consumption growth process leads to declining long-term tails of discount curves. Under the assumption of an exponentially decaying memory in fluctuations of risk-free real interest rates, I derive the analytical expression for an apt value of the long-run discount factor and provide a detailed comparison of the obtained result with the outcomes of the benchmark risk-free interest rate models. Utilizing the standard consumption-based model with an isoelastic power utility of the representative economic agent, I derive the non-Markovian generalization of the Ramsey discounting formula. The obtained analytical results, which allow simple calibration, may augment the rigorous cost-benefit and regulatory impact analysis of long-term environmental and infrastructure projects.
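For context, the benchmark being generalized can be written down. This is the textbook Ramsey rule and the standard certainty-equivalent discount rate, not the paper's non-Markovian result; here δ is the pure rate of time preference, η the (isoelastic) relative risk aversion, and g expected consumption growth:

```latex
% Textbook Ramsey rule (deterministic benchmark):
r \;=\; \delta + \eta\, g
% Certainty-equivalent discount rate at horizon t for a fluctuating rate r(s).
% By Jensen's inequality R(t) \le \bar{r}, and persistent (positively
% correlated) fluctuations push R(t) down further at long horizons:
R(t) \;=\; -\frac{1}{t}\,
  \ln \mathbb{E}\!\left[\exp\!\left(-\int_0^t r(s)\,ds\right)\right]
```

The "declining long-term tail" of the discount curve in the abstract corresponds to R(t) decreasing in t.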
Implementation of a partitioned algorithm for simulation of large CSI problems
NASA Technical Reports Server (NTRS)
Alvin, Kenneth F.; Park, K. C.
1991-01-01
The implementation of a partitioned numerical algorithm for determining the dynamic response of coupled structure/controller/estimator finite-dimensional systems is reviewed. The partitioned approach leads to a set of coupled first- and second-order linear differential equations which are numerically integrated with extrapolation and implicit step methods. The present software implementation, ACSIS, utilizes parallel processing techniques at various levels to optimize performance on a shared-memory concurrent/vector processing system. A general procedure for the design of controller and filter gains is also implemented, which utilizes the vibration characteristics of the structure to be controlled. Also presented are: example problems; a user's guide to the software; the procedures and algorithm scripts; a stability analysis for the algorithm; and the source code for the parallel implementation.
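The partitioned idea — second-order structural equations coupled to first-order controller states, each advanced by its own implicit step using the other's most recent values — can be illustrated on a toy single-degree-of-freedom system. The model, gains, and step size below are hypothetical, not ACSIS's formulation:

```python
# Toy partitioned (staggered) integration: a damped oscillator
#   x'' = -k x - c x' + u
# coupled to a first-order controller state
#   u'  = -a (u - g x).
# Each partition takes its own backward-Euler step per time step.
def simulate(k=1.0, c=0.5, a=2.0, g=-0.5, h=0.01, steps=3000):
    x, v, u = 1.0, 0.0, 0.0
    for _ in range(steps):
        # Structure partition (backward Euler; u lagged at its last value).
        # Solving v_{n+1} = v_n + h(-k x_{n+1} - c v_{n+1} + u_n),
        # x_{n+1} = x_n + h v_{n+1} gives:
        v = (v + h * (-k * x + u)) / (1.0 + h * c + h * h * k)
        x = x + h * v
        # Controller partition (backward Euler, using the fresh x):
        u = (u + h * a * g * x) / (1.0 + h * a)
    return x, v, u

x_end, v_end, u_end = simulate()   # stable feedback: response decays to rest
```

The lagged exchange of states between partitions is what makes the scheme parallelizable, at the cost of the staggering-induced coupling error the stability analysis must account for.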
Health Monitoring System Technology Assessments: Cost Benefits Analysis
NASA Technical Reports Server (NTRS)
Kent, Renee M.; Murphy, Dennis A.
2000-01-01
The subject of sensor-based structural health monitoring is very diverse and encompasses a wide range of activities including initiatives and innovations involving the development of advanced sensor, signal processing, data analysis, and actuation and control technologies. In addition, it embraces the consideration of the availability of low-cost, high-quality contributing technologies, computational utilities, and hardware and software resources that enable the operational realization of robust health monitoring technologies. This report presents a detailed analysis of the cost benefit and other logistics and operational considerations associated with the implementation and utilization of sensor-based technologies for use in aerospace structure health monitoring. The scope of this volume is to assess the economic impact, from an end-user perspective, of implementing health monitoring technologies on three structures. It specifically focuses on evaluating the impact on maintaining and supporting these structures with and without health monitoring capability.
The Influence of Unsteadiness on the Analysis of Pressure Gain Combustion Devices
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.; Kaemming, Tom
2013-01-01
Pressure gain combustion (PGC) has been the object of scientific study for over a century due to its promise of improved thermodynamic efficiency. In many recent application concepts, PGC is utilized as a component in an otherwise continuous, normally steady flow system, such as a gas turbine or ramjet engine. However, PGC is inherently unsteady. Failure to account for the effects of this periodic unsteadiness can lead to misunderstanding and errors in performance calculations. This paper seeks to provide some clarity by presenting a consistent method of thermodynamic cycle analysis for a device utilizing PGC technology. The incorporation of the unsteady PGC process into the conservation equations for a continuous flow device is presented. Most importantly, the appropriate method for computing the conservation of momentum is presented. It will be shown that proper, consistent analysis of cyclic conservation principles produces representative performance predictions.
TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.
Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D
2018-05-08
Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.
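The core measurement such a tool automates — segment the spheroid in each frame, then track a morphological feature over time — can be reduced to a toy sketch: 4-connected component labeling on binary frames. The frames below are hypothetical ASCII stand-ins; TASI itself operates on microscopy image stacks:

```python
# Toy spatiotemporal quantification: find the largest connected foreground
# region per frame and track its area over time (hypothetical binary frames).
from collections import deque

def largest_blob_area(frame):
    """Area of the largest 4-connected foreground ('#') region via BFS."""
    rows, cols = len(frame), len(frame[0])
    seen, best = set(), 0
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] == '#' and (r, c) not in seen:
                area, q = 0, deque([(r, c)])
                seen.add((r, c))
                while q:
                    y, x = q.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] == '#'
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            q.append((ny, nx))
                best = max(best, area)
    return best

frames = [
    ["....", ".#..", "....", "...."],   # t = 0
    ["....", ".##.", ".#..", "...."],   # t = 1
    [".##.", ".##.", ".##.", "...."],   # t = 2
]
areas = [largest_blob_area(f) for f in frames]   # area grows over time
```

Fitting a growth model to such per-frame features, then comparing fitted parameters between experimental conditions, is the statistical step the abstract describes.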
NASA Technical Reports Server (NTRS)
Corban, Robert
1993-01-01
The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams to develop alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and provide a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum possible amount of quantitative data will be developed and/or validated to be utilized in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.
Determinants of job stress in chemical process industry: A factor analysis approach.
Menon, Balagopal G; Praveensal, C J; Madhu, G
2015-01-01
Job stress is one of the active research domains in industrial safety research. Job stress can result in accidents and health-related issues for workers in chemical process industries. Hence it is important to measure the level of job stress in workers so as to mitigate it and avoid workers' safety-related problems in these industries. The objective of this study is to determine the job stress factors in the chemical process industry in Kerala state, India. This study also aims to propose a comprehensive model and an instrument framework for measuring job stress levels in the chemical process industries in Kerala, India. The data were collected through a questionnaire survey conducted in chemical process industries in Kerala. The data from 1,197 surveys were subjected to principal component and confirmatory factor analysis to develop the job stress factor structure. The factor analysis revealed eight factors that influence job stress in process industries. It was also found that job stress in employees is most influenced by role ambiguity and least by work environment. The study developed an instrument framework for measuring job stress utilizing exploratory factor analysis and structural equation modeling.
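The factor-extraction step above — find dominant latent factors in the survey correlation matrix — can be sketched via power iteration for the first principal component. The item scores are hypothetical, and real exploratory factor analysis adds rotation and multiple factors:

```python
# Sketch of factor extraction: first principal component of a correlation
# matrix by power iteration (hypothetical survey items, stdlib only).
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / sqrt(vx * vy)

def first_factor(cols, iters=200):
    """Dominant eigenvalue/eigenvector (loadings) of the correlation matrix."""
    R = [[pearson(a, b) for b in cols] for a in cols]
    n = len(R)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(R[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]                     # normalize each step
    eig = sum(v[i] * sum(R[i][j] * v[j] for j in range(n)) for i in range(n))
    return eig, v

# Hypothetical item scores: items 0 and 1 nearly duplicate (one latent
# factor, e.g. "role ambiguity"); item 2 is only weakly related.
items = [[1.0, 2.0, 3.0, 4.0, 5.0],
         [1.1, 1.9, 3.2, 3.9, 5.1],
         [5.0, 1.0, 4.0, 2.0, 3.0]]
eig, loadings = first_factor(items)   # correlated items load heavily together
```

An eigenvalue above 1 is the usual Kaiser criterion for retaining a factor; repeating on the deflated matrix yields further factors.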
Genetically tunable M13 phage films utilizing evaporating droplets.
Alberts, Erik; Warner, Chris; Barnes, Eftihia; Pilkiewicz, Kevin; Perkins, Edward; Poda, Aimee
2018-01-01
This effort utilizes a genetically tunable system of bacteriophage to evaluate the effect of charge, temperature and particle concentration on biomaterial synthesis utilizing the coffee ring (CR) effect. There was a 1.6- to 3-fold suppression of the CR at higher temperatures while maintaining self-assembled structures of thin films. This suppression was observed in phage with charged and uncharged surface chemistry, which formed ordered and disordered assemblies, respectively, indicating that CR suppression is not dependent on short-range ordering or surface chemistry. Analysis of the drying process suggests weakened capillary flow at elevated temperatures caused CR suppression and could be further exploited for controlled assembly of advanced biomaterials. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Interconnecting PV on New York City's Secondary Network Distribution System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, K; Coddington, M; Burman, K
2009-11-01
The U.S. Department of Energy (DOE) has teamed with cities across the country through the Solar America Cities (SAC) partnership program to help reduce barriers and accelerate implementation of solar energy. The New York City SAC team is a partnership between the City University of New York (CUNY), the New York City Mayor's Office of Long-term Planning and Sustainability, and the New York City Economic Development Corporation (NYCEDC). The New York City SAC team is working with DOE's National Renewable Energy Laboratory (NREL) and Con Edison, the local utility, to develop a roadmap for photovoltaic (PV) installations in the five boroughs. The city set a goal to increase its installed PV capacity from 1.1 MW in 2005 to 8.1 MW by 2015 (the maximum allowed in 2005). A key barrier to reaching this goal, however, is the complexity of the interconnection process with the local utility. Unique challenges are associated with connecting distributed PV systems to secondary network distribution systems (simplified to networks in this report). Although most areas of the country use simpler radial distribution systems to distribute electricity, larger metropolitan areas like New York City typically use networks to increase reliability in large load centers. Unlike the radial distribution system, where each customer receives power through a single line, a network uses a grid of interconnected lines to deliver power to each customer through several parallel circuits and sources. This redundancy improves reliability, but it also requires more complicated coordination and protection schemes that can be disrupted by energy exported from distributed PV systems. Currently, Con Edison studies each potential PV system in New York City to evaluate the system's impact on the network, but this is time consuming for utility engineers and may delay the customer's project or add cost for larger installations.
City leaders would like to streamline this process to facilitate faster, simpler, and less expensive distributed PV system interconnections. To assess ways to improve the interconnection process, NREL conducted a four-part study with support from DOE. The NREL team then compiled the final reports from each study into this report. In Section 1, PV Deployment Analysis for New York City, we analyze the technical potential for rooftop PV systems in the city. This analysis evaluates potential PV power production in ten Con Edison networks of various locations and building densities (ranging from high density apartments to lower density single family homes). Next, we compare the potential power production to network loads to determine where and when PV generation is most likely to exceed network load and disrupt network protection schemes. The results of this analysis may assist Con Edison in evaluating future PV interconnection applications and in planning future network protection system upgrades. This analysis may also assist other utilities interconnecting PV systems to networks by defining a method for assessing the technical potential of PV in the network and its impact on network loads. Section 2. A Briefing for Policy Makers on Connecting PV to a Network Grid presents an overview intended for nontechnical stakeholders. This section describes the issues associated with interconnecting PV systems to networks, along with possible solutions. Section 3. Technical Review of Concerns and Solutions to PV Interconnection in New York City summarizes common concerns of utility engineers and network experts about interconnecting PV systems to secondary networks. This section also contains detailed descriptions of nine solutions, including advantages and disadvantages, potential impacts, and road maps for deployment. Section 4.
Utility Application Process Reviewlooks at utility interconnection application processes across the country and identifies administrative best practices for efficient PV interconnection.« less
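The core screening step in the deployment analysis above, comparing modeled hourly PV output against network load to flag hours of potential reverse power flow, can be sketched as follows. The load and PV profiles, capacity figure, and network size are invented for illustration; they are not Con Edison data:

```python
import numpy as np

hours = np.arange(24)
# Hypothetical hourly profiles for one network, in MW (illustrative only).
network_load = 20 + 15 * np.sin((hours - 9) * np.pi / 12)
pv_capacity = 30.0  # assumed installed PV for the network, MW
pv_output = pv_capacity * np.clip(np.sin((hours - 6) * np.pi / 12), 0, None)

# Hours where PV generation exceeds network load imply exported power
# that could disrupt network protection schemes (reverse power flow).
export_hours = hours[pv_output > network_load]
penetration = pv_output / network_load  # instantaneous PV penetration
print("hours with potential reverse power flow:", export_hours)
print(f"peak penetration: {100 * penetration.max():.0f}%")
```

With measured load and modeled PV production per network, the same comparison identifies which networks and hours warrant protection-system study.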
[Critical thinking skills in the nursing diagnosis process].
Bittencourt, Greicy Kelly Gouveia Dias; Crossetti, Maria da Graça Oliveira
2013-04-01
The aim of this study was to identify the critical thinking skills utilized in the nursing diagnosis process. This was an exploratory descriptive study conducted with seven nursing students on the application of a clinical case to identify critical thinking skills, as well as their justifications in the nursing diagnosis process. Content analysis was performed to evaluate descriptive data. Six participants reported that analysis, scientific and technical knowledge, and logical reasoning skills are important in identifying priority nursing diagnoses; clinical experience was cited by five participants, and knowledge about the patient and application of standards were mentioned by three participants. Furthermore, discernment and contextual perspective were skills noted by two participants. Based on these results, the use of critical thinking skills related to the steps of the nursing diagnosis process was observed. Therefore, the application of this process may constitute a strategy that enables the development of critical thinking skills.
NASA Astrophysics Data System (ADS)
Hananto, R. B.; Kusmayadi, T. A.; Riyadi
2018-05-01
The research aims to identify the critical thinking process of students in solving geometry problems. The geometry problem selected in this study involved a flat-sided solid, the cube. The critical thinking process was examined across visual, auditory, and kinesthetic learning styles. This research was a descriptive analysis using a qualitative method. The subjects of this research were 3 students selected by purposive sampling, representing visual, auditory, and kinesthetic learning styles. Data collection was done through test, interview, and observation. The results showed that the students' critical thinking processes in the identifying and defining steps were similar across learning styles. The critical thinking differences were seen in the enumerate, analyze, list, and self-correct steps. It was also found that the critical thinking process of students with a kinesthetic learning style was better than that of students with visual and auditory learning styles.
Smith, Andrew M; Wells, Gary L; Lindsay, R C L; Penrod, Steven D
2017-04-01
Receiver Operating Characteristic (ROC) analysis has recently come into vogue for assessing the underlying discriminability and the applied utility of lineup procedures. Two primary assumptions underlie recommendations that ROC analysis be used to assess the applied utility of lineup procedures: (a) ROC analysis of lineups measures underlying discriminability, and (b) the procedure that produces superior underlying discriminability produces superior applied utility. These same assumptions underlie a recently derived diagnostic-feature detection theory, a theory of discriminability, intended to explain recent patterns observed in ROC comparisons of lineups. We demonstrate, however, that these assumptions are incorrect when ROC analysis is applied to lineups. We also demonstrate that a structural phenomenon of lineups, differential filler siphoning, and not the psychological phenomenon of diagnostic-feature detection, explains why lineups are superior to showups and why fair lineups are superior to biased lineups. In the process of our proofs, we show that computational simulations have assumed, unrealistically, that all witnesses share exactly the same decision criteria. When criterial variance is included in computational models, differential filler siphoning emerges. The result proves a dissociation between ROC curves and underlying discriminability: higher ROC curves for lineups than for showups, and for fair than for biased lineups, despite no increase in underlying discriminability.
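The criterial-variance point above lends itself to a small simulation. The sketch below is a generic signal-detection toy model, not the authors' actual computational model; all distributions and parameter values are assumptions. On target-absent trials, giving each witness their own criterion lets fillers in a fair lineup siphon choices that a showup would direct at the innocent suspect:

```python
import numpy as np

rng = np.random.default_rng(0)
n_witnesses = 200_000
lineup_size = 6  # one innocent suspect plus five fillers

# Each simulated witness has their own decision criterion (criterial variance).
criteria = rng.normal(0.5, 0.7, n_witnesses)

# Target-absent trials: every face is innocent, memory strength ~ N(0, 1).
showup_face = rng.normal(0.0, 1.0, n_witnesses)
showup_false_id = np.mean(showup_face > criteria)

# Fair lineup: the witness picks the strongest face if it exceeds the
# criterion; the innocent suspect is chosen only when it is that face.
lineup_faces = rng.normal(0.0, 1.0, (n_witnesses, lineup_size))
chose = lineup_faces.max(axis=1) > criteria
suspect_is_best = lineup_faces.argmax(axis=1) == 0  # slot 0 = suspect
lineup_suspect_id = np.mean(chose & suspect_is_best)

# Fillers siphon most false picks away from the innocent suspect.
print(f"showup false IDs: {showup_false_id:.3f}")
print(f"lineup innocent-suspect IDs: {lineup_suspect_id:.3f}")
```

In this toy model the innocent-suspect identification rate in the fair lineup falls well below the showup false-identification rate even though per-face discriminability is identical, which is the filler-siphoning structure the abstract describes.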
Ratner, Nan Bernstein; MacWhinney, Brian
2016-05-01
In this article, we review the advantages of language sample analysis (LSA) and explain how clinicians can make the process of LSA faster, easier, more accurate, and more insightful than LSA done "by hand" by using free, available software programs such as Computerized Language Analysis (CLAN). We demonstrate the utility of CLAN analysis in studying the expressive language of a very large cohort of 24-month-old toddlers tracked in a recent longitudinal study; toddlers in particular are the most likely group to receive LSA by clinicians, but existing reference "norms" for this population are based on fairly small cohorts of children. Finally, we demonstrate how a CLAN utility such as KidEval can now extract potential normative data from the very large number of corpora now available for English and other languages at the Child Language Data Exchange System project site. Most of the LSA measures that we studied appear to show developmental profiles suggesting that they may be of specifically higher value for children at certain ages, because they do not show an even developmental trajectory from 2 to 7 years of age.
Dalgin, Rebecca Spirito; Dalgin, M Halim; Metzger, Scott J
2018-05-01
This article focuses on the impact of a peer run warm line as part of the psychiatric recovery process. It utilized data including the Recovery Assessment Scale, community integration measures and crisis service usage. Longitudinal statistical analysis was completed on 48 sets of data from 2011, 2012, and 2013. Although no statistically significant differences were observed for the RAS score, community integration data showed increases in visits to primary care doctors, leisure/recreation activities and socialization with others. This study highlights the complexity of psychiatric recovery and that nonclinical peer services like peer run warm lines may be critical to the process.
USAF solar thermal applications overview
NASA Technical Reports Server (NTRS)
Hauger, J. S.; Simpson, J. A.
1981-01-01
Process heat applications were compared with solar thermal technologies. The generic process heat applications were analyzed for solar thermal technology utilization, using SERI's PROSYS/ECONOMAT model in an end-use matching analysis, and a separate analysis was made for solar ponds. Solar technologies appear attractive in a large number of applications. Low temperature applications at sites with high insolation and high fuel costs were found to be most attractive. No one solar thermal technology emerges as a clearly universal or preferred technology; however, solar ponds offer a potential high payoff in a few selected applications. It was shown that troughs and flat plate systems are cost effective in a large number of applications.
NASA Technical Reports Server (NTRS)
Casas, J. C.; Koziana, J. V.; Saylor, M. S.; Kindle, E. C.
1982-01-01
Problems associated with the development of the measurement of air pollution from satellites (MAPS) experiment program are addressed. The primary thrust of this research was the utilization of the MAPS experiment data in three application areas: low altitude aircraft flights (one to six km); mid altitude aircraft flights (eight to 12 km); and orbiting space platforms. Extensive research work in four major areas of data management was the framework for implementation of the MAPS experiment technique. These areas are: (1) data acquisition; (2) data processing, analysis and interpretation algorithms; (3) data display techniques; and (4) information production.
NASA Technical Reports Server (NTRS)
Klemas, V. (Principal Investigator); Bartlett, D.; Rogers, R.; Reed, L.
1974-01-01
The author has identified the following significant results. Analysis of ERTS-1 color composite images using analog processing equipment confirmed that all the major wetlands plant species were distinguishable at ERTS-1 scale. Furthermore, human alterations of the coastal zone were easily recognized, since such alterations typically involve removal of vegetative cover resulting in a change of spectral signature. The superior spectral resolution of the CCTs as compared with single band or composite imagery has indeed provided good discrimination through digital analysis of the CCTs, with the added advantage of rapid production of thematic maps and data.
Pollution profile and biodegradation characteristics of fur-suede processing effluents.
Yildiz Töre, G; Insel, G; Ubay Cokgör, E; Ferlier, E; Kabdaşli, I; Orhon, D
2011-07-01
This study investigated the effect of stream segregation on the biodegradation characteristics of wastewaters generated by fur-suede processing. It was conducted on a plant located in an organized industrial district in Turkey. A detailed in-plant analysis of the process profile and the resulting pollution profile in terms of significant parameters indicated the characteristics of a strong wastewater with a maximum total COD of 4285 mg/L, despite the excessive wastewater generation of 205 m³ per ton of skin. Respirometric analysis by model calibration yielded slow biodegradation kinetics and showed that around 50% of the particulate organics were utilized at a rate similar to that of endogenous respiration. A similar analysis on the segregated wastewater streams suggested that biodegradation of the plant effluent is controlled largely by the initial washing/pickling operations. The effect of other effluent streams was not significant due to their relatively low contribution to the overall organic load. The respirometric tests showed that the biodegradation kinetics of the joint treatment plant influent of the district were substantially improved and exhibited typical levels reported for tannery wastewater, so that the inhibitory impact was suppressed to a great extent by dilution and mixing with effluents of the other plants. The chemical treatment step in the joint treatment plant removed the majority of the particulate organics so that 80% of the available COD was utilized in the oxygen uptake rate (OUR) test, a ratio quite compatible with the biodegradable COD fractions of tannery wastewater. Consequently, process kinetics and especially the hydrolysis rate appeared to be significantly improved.
Application of LANDSAT data to the study of urban development in Brasilia
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Deoliveira, M. D. L. N.; Foresti, C.; Niero, M.; Parreira, E. M. D. M. F.
1984-01-01
The urban growth of Brasilia within the last ten years is analyzed with special emphasis on the utilization of remote sensing orbital data and automatic image processing. The urban spatial structure and the monitoring of its temporal changes were examined in a whole and dynamic way by the utilization of MSS-LANDSAT images for June (1973, 1978 and 1983). In order to aid data interpretation, a registration algorithm implemented in the Interactive Multispectral Image Analysis System (IMAGE-100) was utilized aiming at the overlap of multitemporal images. The utilization of suitable digital filters, combined with the images overlap, allowed a rapid identification of areas of possible urban growth and oriented the field work. The results obtained in this work permitted an evaluation of the urban growth of Brasilia, taking as reference the proposal stated for the construction of the city in the Pilot Plan elaborated by Lucio Costa.
Hidden flows and waste processing--an analysis of illustrative futures.
Schiller, F; Raffield, T; Angus, A; Herben, M; Young, P J; Longhurst, P J; Pollard, S J T
2010-12-14
An existing materials flow model is adapted (using Excel and AMBER model platforms) to account for waste and hidden material flows within a domestic environment. Supported by national waste data, the implications of legislative change, domestic resource depletion and waste technology advances are explored. The revised methodology offers additional functionality for economic parameters that influence waste generation and disposal. We explore this accounting system under hypothetical future waste and resource management scenarios, illustrating the utility of the model. A sensitivity analysis confirms that imports, domestic extraction and their associated hidden flows impact mostly on waste generation. The model offers enhanced utility for policy and decision makers with regard to economic mass balance and strategic waste flows, and may promote further discussion about waste technology choice in the context of reducing carbon budgets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ansong, Charles; Wu, Si; Meng, Da
Characterization of the mature protein complement in cells is crucial for a better understanding of cellular processes on a systems-wide scale. Bottom-up proteomic approaches often lead to loss of critical information about an endogenous protein's actual state due to post-translational modifications (PTMs) and other processes. Top-down approaches that involve analysis of the intact protein can address this concern but present significant analytical challenges related to the separation quality needed, measurement sensitivity, and speed that result in low throughput and limited coverage. Here we used single-dimension ultra-high-pressure liquid chromatography mass spectrometry to investigate the comprehensive 'intact' proteome of the Gram-negative bacterial pathogen Salmonella Typhimurium. Top-down proteomics analysis revealed 563 unique proteins including 1665 proteoforms generated by PTMs, representing the largest microbial top-down dataset reported to date. Our analysis not only confirmed several previously recognized aspects of Salmonella biology and bacterial PTMs in general, but also revealed several novel biological insights. Of particular interest was differential utilization of the protein S-thiolation forms S-glutathionylation and S-cysteinylation in response to infection-like conditions versus basal conditions, which was corroborated by changes in corresponding biosynthetic pathways. This differential utilization highlights underlying metabolic mechanisms that modulate changes in cellular signaling, and represents to our knowledge the first report of S-cysteinylation in Gram-negative bacteria. The demonstrated utility of our simple proteome-wide intact protein level measurement strategy for gaining biological insight should promote broader adoption and applications of top-down proteomics approaches.
Huang, Xuan-Yi; Yen, Wen-Jiuan; Liu, Shwu-Jiuan; Lin, Chouh-Jiuan
2008-03-01
The aim was to develop a practice theory that can be used to guide the direction of community nursing practice to help clients with schizophrenia and those who care for them. Substantive grounded theory was developed through use of the grounded theory method of Strauss and Corbin. Two groups of participants in Taiwan were selected using theoretical sampling: one group consisted of community mental health nurses, and the other group was clients with schizophrenia and those who cared for them. The number of participants in each group was determined by theoretical saturation. Semi-structured one-to-one in-depth interviews and unstructured non-participant observation were utilized for data collection. Data analysis involved three stages: open, axial, and selective coding. During the process of coding and analysis, both inductive and deductive thinking were utilized, and the constant comparative analysis process continued until data saturation occurred. To establish trustworthiness, the four criteria of credibility, transferability, dependability, and confirmability were followed, along with field trial, audit trail, member check, and peer debriefing for reliability and validity. A substantive grounded theory, the role of community mental health nurses caring for people with schizophrenia in Taiwan, was developed through utilization of the grounded theory method of Strauss and Corbin. In this paper, results and discussion focus on causal conditions, context, intervening conditions, consequences, and phenomenon. The theory is the first to contribute knowledge about the field of mental health home visiting services in Taiwan to provide guidance for the delivery of quality care to assist people in the community with schizophrenia and their carers.
Analysis capabilities for plutonium-238 programs
NASA Astrophysics Data System (ADS)
Wong, A. S.; Rinehart, G. H.; Reimus, M. H.; Pansoy-Hjelvik, M. E.; Moniz, P. F.; Brock, J. C.; Ferrara, S. E.; Ramsey, S. S.
2000-07-01
In this presentation, an overview of analysis capabilities that support 238Pu programs will be discussed. These capabilities include neutron emission rate and calorimetric measurements, metallography/ceramography, ultrasonic examination, particle size determination, and chemical analyses. The data obtained from these measurements provide baseline parameters for fuel clad impact testing, fuel processing, product certifications, and waste disposal. Also, several in-line analysis capabilities will be utilized for process control in the full-scale 238Pu Aqueous Scrap Recovery line in FY01.
NASA Technical Reports Server (NTRS)
Steele, Gynelle C.
1999-01-01
The NASA Lewis Research Center and Flow Parametrics will enter into an agreement to commercialize the National Combustion Code (NCC). This multidisciplinary combustor design system utilizes computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. This integrated system can facilitate and enhance various phases of the design and analysis process.
Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.
Utility-based early modulation of processing distracting stimulus information.
Wendt, Mike; Luna-Rodriguez, Aquiles; Jacobsen, Thomas
2014-12-10
Humans are selective information processors who efficiently prevent goal-inappropriate stimulus information from gaining control over their actions. Nonetheless, stimuli which are both unnecessary for solving a current task and liable to cue an incorrect response (i.e., "distractors") frequently modulate task performance, even when consistently paired with a physical feature that makes them easily discernible from target stimuli. Current models of cognitive control assume adjustment of the processing of distractor information based on the overall distractor utility (e.g., predictive value regarding the appropriate response, likelihood to elicit conflict with target processing). Although studies on distractor interference have supported the notion of utility-based processing adjustment, previous evidence is inconclusive regarding the specificity of this adjustment for distractor information and the stage(s) of processing affected. To assess the processing of distractors during sensory-perceptual phases we applied EEG recording in a stimulus identification task, involving successive distractor-target presentation, and manipulated the overall distractor utility. Behavioral measures replicated previously found utility modulations of distractor interference. Crucially, distractor-evoked visual potentials (i.e., posterior N1) were more pronounced in high-utility than low-utility conditions. This effect generalized to distractors unrelated to the utility manipulation, providing evidence for item-unspecific adjustment of early distractor processing to the experienced utility of distractor information.
TableViewer for Herschel Data Processing
NASA Astrophysics Data System (ADS)
Zhang, L.; Schulz, B.
2006-07-01
The TableViewer utility is a GUI tool written in Java to support interactive data processing and analysis for the Herschel Space Observatory (Pilbratt et al. 2001). The idea was inherited from a prototype written in IDL (Schulz et al. 2005). It allows the user to graphically view and analyze tabular data organized in columns with equal numbers of rows. It can be run either as a standalone application, where data access is restricted to FITS (FITS 1999) files only, or from the Quick Look Analysis (QLA) or Interactive Analysis (IA) command line, from where objects are also accessible. The graphic display is very versatile, allowing plots in either linear or log scales. Zooming, panning, and changing data columns are performed rapidly using a group of navigation buttons. Selecting and de-selecting fields of data points controls the input to simple analysis tasks such as building a statistics table or generating power spectra. The binary data stored in a TableDataset, a Product, or in FITS files can also be displayed as tabular data, where values in individual cells can be modified. TableViewer provides several processing utilities which, besides calculation of statistics either for all channels or for selected channels, and calculation of power spectra, allow the user to convert/repair datasets by changing the unit name of data columns and by modifying data values in columns with a simple calculator tool. Interactively selected data can be separated out, and modified data sets can be saved to FITS files. The tool will be very helpful especially in the early phases of Herschel data analysis when quick access to the contents of data products is important. TableDataset and Product are Java classes defined in herschel.ia.dataset.
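The simple analysis tasks mentioned, column statistics and power spectra, are conceptually straightforward. TableViewer itself is a Java tool inside the Herschel IA framework; the following Python sketch only illustrates the kind of computation involved, on an invented sinusoidal detector timeline:

```python
import numpy as np

# Hypothetical detector timeline: a 2 Hz sinusoid plus noise, sampled at 20 Hz.
dt = 0.05
t = np.arange(0, 50, dt)
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.normal(size=t.size)

# Column statistics, as a table viewer might report them.
stats = {"mean": signal.mean(), "std": signal.std(), "n": signal.size}

# One-sided power spectrum of the (mean-subtracted) column.
freqs = np.fft.rfftfreq(signal.size, d=dt)
power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
peak_freq = freqs[power.argmax()]
print(stats)
print(f"dominant frequency: {peak_freq:.2f} Hz")
```

Applied to a selected column of a table, the same two steps yield the statistics table and power spectrum views described above.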
Xi-cam: a versatile interface for data visualization and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke
Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
Xi-cam: a versatile interface for data visualization and analysis
Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...
2018-05-31
Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
2011-01-01
Background There is a lack of acceptable, reliable, and valid survey instruments to measure conceptual research utilization (CRU). In this study, we investigated the psychometric properties of a newly developed scale (the CRU Scale). Methods We used the Standards for Educational and Psychological Testing as a validation framework to assess four sources of validity evidence: content, response processes, internal structure, and relations to other variables. A panel of nine international research utilization experts performed a formal content validity assessment. To determine response process validity, we conducted a series of one-on-one scale administration sessions with 10 healthcare aides. Validity evidence for internal structure and relations to other variables was examined using CRU Scale response data from a sample of 707 healthcare aides working in 30 urban Canadian nursing homes. Principal components analysis and confirmatory factor analyses were conducted to determine internal structure. Relations to other variables were examined using: (1) bivariate correlations; (2) change in mean values of CRU with increasing levels of other kinds of research utilization; and (3) multivariate linear regression. Results Content validity index scores for the five items ranged from 0.55 to 1.00. The principal components analysis predicted a 5-item 1-factor model. This was inconsistent with the findings from the confirmatory factor analysis, which showed best fit for a 4-item 1-factor model. Bivariate associations between CRU and other kinds of research utilization were statistically significant (p < 0.01) for the latent CRU scale score and all five CRU items. The CRU scale score was also shown to be a significant predictor of overall research utilization in multivariate linear regression. Conclusions The CRU scale showed acceptable initial psychometric properties with respect to responses from healthcare aides in nursing homes.
Based on our validity, reliability, and acceptability analyses, we recommend using a reduced (four-item) version of the CRU scale to yield sound assessments of CRU by healthcare aides. Refinement to the wording of one item is also needed. Planned future research will include: latent scale scoring, identification of variables that predict and are outcomes to conceptual research use, and longitudinal work to determine CRU Scale sensitivity to change. PMID:21595888
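The internal-structure step reported above (principal components analysis of the five items, with components retained by eigenvalue) can be illustrated on synthetic data. The loadings, the deliberately weak fifth item, and all values below are fabricated for the sketch; they are not the CRU Scale's actual results:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 707  # sample size matching the study; the item data here are synthetic

# Continuous proxies for five Likert-type items driven by one latent factor,
# with one item loading more weakly (all loadings are invented).
latent = rng.normal(size=n)
loadings = np.array([0.8, 0.8, 0.75, 0.7, 0.5])
noise = rng.normal(size=(n, 5))
items = latent[:, None] * loadings + noise * np.sqrt(1 - loadings**2)

# Principal components analysis via the item correlation matrix.
corr = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]  # eigenvalues, descending

# Kaiser criterion: retain components with eigenvalue > 1.
n_components = int(np.sum(eigvals > 1.0))
explained = eigvals[0] / eigvals.sum()
print("retained components:", n_components)
print(f"variance explained by the first component: {explained:.0%}")
```

A one-component solution from PCA, as here, can still coexist with a confirmatory model that fits better after dropping a weak item, which is the pattern the abstract reports.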
Palchaudhuri, Sonali; Tweya, Hannock; Hosseinipour, Mina
2014-06-01
The 2011 Malawi HIV guidelines promote CD4 monitoring for pre-ART assessment and consideration of HIVRNA monitoring for ART response assessment, while some clinics used CD4 for both. We assessed clinical ordering practices as compared to guidelines, and determined whether the samples were successfully and promptly processed. We conducted a retrospective review of all patients seen from August 2010 through July 2011 in two urban HIV-care clinics that utilized 6-monthly CD4 monitoring regardless of ART status. We calculated the percentage of patients on whom clinicians ordered CD4 or HIVRNA analysis. For all samples sent, we determined rates of successful lab processing and mean time to returned results. Of 20581 patients seen, 8029 (39%) had at least one blood draw for CD4 count. Among pre-ART patients, 2668/2844 (93.8%) had CD4 counts performed for eligibility. Of all CD4 samples sent, 8082/9207 (89%) were successfully processed. Of those, mean time to processing was 1.6 days (s.d. 1.5), but mean time to results being available to the clinician was 9.3 days (s.d. 3.7). Regarding HIVRNA, 172 patients of 17737 on ART had a blood draw, and only 118/213 (55%) samples were successfully processed. Mean processing time was 39.5 days (s.d. 21.7); mean time to results being available to the clinician was 43.1 days (s.d. 25.1). During the one-year period evaluated, there were multiple lapses in processing HIVRNA samples for up to 2 months. Clinicians underutilize CD4 and HIVRNA as monitoring tools in HIV care. Laboratory processing failures and turnaround times are unacceptably high for viral load analysis. Alternative strategies need to be considered in order to meet laboratory monitoring needs.
Research on the use of space resources
NASA Technical Reports Server (NTRS)
Carroll, W. F. (Editor)
1983-01-01
The second year of a multiyear research program on the processing and use of extraterrestrial resources is covered. The research tasks included: (1) silicate processing, (2) magma electrolysis, (3) vapor phase reduction, and (4) metals separation. Concomitant studies included: (1) energy systems, (2) transportation systems, (3) utilization analysis, and (4) resource exploration missions. Emphasis in fiscal year 1982 was placed on the magma electrolysis and vapor phase reduction processes (both analytical and experimental) for separation of oxygen and metals from lunar regolith. The early experimental work on magma electrolysis resulted in gram quantities of iron (mixed metals) and the identification of significant anode, cathode, and container problems. In the vapor phase reduction tasks a detailed analysis of various process concepts led to the selection of two specific processes designated "Vapor Separation" and "Selective Ionization." Experimental work was deferred to fiscal year 1983. In the Silicate Processing task a thermophysical model of the casting process was developed and used to study the effect of variations in material properties on the cooling behavior of lunar basalt.
Evaluating a Policing Strategy Intended to Disrupt an Illicit Street-Level Drug Market
ERIC Educational Resources Information Center
Corsaro, Nicholas; Brunson, Rod K.; McGarrell, Edmund F.
2010-01-01
The authors examined a strategic policing initiative that was implemented in a high crime Nashville, Tennessee neighborhood by utilizing a mixed-methodological evaluation approach in order to provide (a) a descriptive process assessment of program fidelity; (b) an interrupted time-series analysis relying upon generalized linear models; (c)…
Thermosets of epoxy monomer from Tung oil fatty acids cured in two synergistic ways
USDA-ARS?s Scientific Manuscript database
A new epoxy monomer from tung oil fatty acids, glycidyl ester of eleostearic acid (GEEA), was synthesized and characterized by 1H-NMR and 13C-NMR spectroscopy. Differential scanning calorimetry analysis (DSC) and FT-IR were utilized to investigate the curing process of GEEA cured by both dienophiles...
ERIC Educational Resources Information Center
Howes, Andrew; Lewis, Richard L.; Vera, Alonso
2009-01-01
The authors assume that individuals adapt rationally to a utility function given constraints imposed by their cognitive architecture and the local task environment. This assumption underlies a new approach to modeling and understanding cognition--cognitively bounded rational analysis--that sharpens the predictive acuity of general, integrated…
2006-06-01
KMO) for the CFMCC staff. That officer had a daily meeting with all of the CFMCC’s collateral duty knowledge managers (KM) to discuss information...analyses of process steps) and mentored by the KMO, could enhance knowledge creation and utilization while not jeopardizing work flows. Clearly in
USDA-ARS?s Scientific Manuscript database
Morphological components of biomass stems vary in their chemical composition and they can be better utilized when processed after segregation. Within the stem, nodes and internodes have significantly different compositions. The internodes have low ash content and are a better feedstock for bioenergy...
Analysis of Emission Reduction Strategies for Power Boilers in the US Pulp and Paper Industry.
The U.S. pulp and paper industry utilizes a variety of fuels to provide energy for process needs. Energy production results in air emissions of sulfur dioxide (SO2), nitrogen oxides (NOX), particulate matter (PM), and greenhouse gases such as carbon dioxide (CO2). Air emissions f...
Development of state and transition model assumptions used in National Forest Plan revision
Eric B. Henderson
2008-01-01
State and transition models are being utilized in forest management analysis processes to evaluate assumptions about disturbances and succession. These models assume valid information about seral class successional pathways and timing. The Forest Vegetation Simulator (FVS) was used to evaluate seral class succession assumptions for the Hiawatha National Forest in...
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
USDA-ARS?s Scientific Manuscript database
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
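The bootstrap procedure that Bootsie automates can be sketched in a few lines. This is a generic illustration of bootstrapping the coefficient of variation, not Bootsie's or DBOOT's actual code; the marker values, seed, and iteration count are hypothetical.

```python
import random
import statistics

def bootstrap_cv(values, n_boot=1000, seed=42):
    """Estimate the coefficient of variation (CV) of a sample and the
    bootstrap standard error of that estimate, by resampling with replacement."""
    rng = random.Random(seed)
    n = len(values)
    cvs = []
    for _ in range(n_boot):
        sample = [values[rng.randrange(n)] for _ in range(n)]
        cvs.append(statistics.stdev(sample) / statistics.fmean(sample))
    return statistics.fmean(cvs), statistics.stdev(cvs)

# Hypothetical marker intensity data for illustration
cv_est, cv_se = bootstrap_cv([0.8, 1.1, 0.9, 1.3, 1.0, 0.7, 1.2])
print(f"CV = {cv_est:.3f} (bootstrap SE {cv_se:.3f})")
```

Batch processing, as in Bootsie, would simply apply this function to each population file in turn.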
An Exploratory Study of Animal-Assisted Interventions Utilized by Mental Health Professionals
ERIC Educational Resources Information Center
O'Callaghan, Dana M.; Chandler, Cynthia K.
2011-01-01
This study implemented an exploratory analysis to examine how a sample of mental health professionals incorporates specific animal-assisted techniques into the therapeutic process. An extensive review of literature related to animal-assisted therapy (AAT) resulted in the identification of 18 techniques and 10 intentions for the practice of AAT in…
ERIC Educational Resources Information Center
Olmsted, John
1984-01-01
Describes a five-period experiment which: (1) integrates preparative and analytical techniques; (2) utilizes a photochemical reaction that excites student interest both from visual impact and as an introduction to photoinduced processes; (3) provides accurate results; and (4) costs less than $0.20 per student per laboratory session. (JN)
NASA Astrophysics Data System (ADS)
Marpaung, B. O. Y.; Waginah
2018-03-01
Every community settlement that forms is related to the social, cultural, and economic conditions of its society. Participation is a process involving human interaction, and from these interactions arise activities that can potentially form new space (Hendriksen et al., 2012). The problems addressed in this research concern community involvement in building housing, determining land use, building roads, and providing utilities in the Kampung Nelayan Belawan Medan settlement. The aim of this research is to identify that community involvement in building housing, determining land use, building roads, and providing utilities. In the process of collecting data, the researchers conducted field observation and interviews, then connected theory with the interpretation of the data to determine the method of data analysis. The finding of this research is that the formation of settlement spaces in the fishing village is inseparable from community participation in Kampung Nelayan Belawan Medan.
Hu, Qin-xue; Barry, Ashley Perkins; Wang, Zi-xuan; Connolly, Shanon M.; Peiper, Stephen C.; Greenberg, Michael L.
2000-01-01
The evolution of human immunodeficiency virus type 1 infection is associated with a shift in the target cell population, driven by variability in coreceptor utilization resulting from diversity in env. To elucidate the potential consequences of these changes for Env-mediated fusion over the course of AIDS, we examined the biological properties of serial viral isolates and determined coreceptor utilization by the products of env cloned from two individuals, followed from the detection of seroconversion throughout the course of their infection. One had a typical course, and the other had an accelerated progression. Early isolates were non-syncytium inducing, and the corresponding Env exclusively utilized CCR5, whereas Env from late phases of infection showed restricted utilization of CXCR4 in both patients. Env from subject SC24, who had a standard progression, demonstrated multitropism, manifested by utilization of CCR3, CXCR4, and CCR5 in the intervening period. In contrast, Env from patient SC51, who experienced early conversion to the syncytium-inducing phenotype, developed dualtropic coreceptor utilization of CCR5 and CXCR4. Genetic analysis of env from each isolate revealed that those with an X4 phenotype formed a distinct subcluster within each subject. Analysis of chimeras constructed from R5 and multispecific env from patient SC24 demonstrated that while the V3 domain played a dominant role in determining coreceptor utilization, sequences in the V4–V5 region also contributed to the latter phenotype. Immunoprecipitation experiments confirmed that the hybrid Env proteins were expressed at similar levels. These experiments demonstrate that progression from the R5 to X4 phenotype may occur through a multi- or dual-tropic intermediate and that multiple domains contribute to this process. PMID:11090186
Effects of eHealth Literacy on General Practitioner Consultations: A Mediation Analysis.
Schulz, Peter Johannes; Fitzpatrick, Mary Anne; Hess, Alexandra; Sudbury-Riley, Lynn; Hartung, Uwe
2017-05-16
Most evidence, though not all, suggests that individuals with a higher level of health literacy utilize the health care system less frequently than individuals with lower levels of health literacy. The underlying reasons for this effect are largely unclear, though people's ability to seek health information independently, given the wide availability of such information on the Internet, has been cited in this context. We propose and test two potential mediators of the negative effect of eHealth literacy on health care utilization: (1) health information seeking and (2) gain in empowerment by information seeking. Data were collected in New Zealand, the United Kingdom, and the United States using a Web-based survey administered by a company specializing in online panels. Combined, the three samples resulted in a total of 996 baby boomers born between 1946 and 1965 who had used the Internet to search for and share health information in the previous 6 months. Measured variables include eHealth literacy, Internet health information seeking, the self-perceived gain in empowerment by that information, and the number of consultations with one's general practitioner (GP). Path analysis was employed for data analysis. We found a bundle of indirect effect paths showing a positive relationship between health literacy and health care utilization: via health information seeking (Path 1), via gain in empowerment (Path 2), and via both (Path 3). In addition to the emergence of these indirect effects, the direct effect of health literacy on health care utilization disappeared. The indirect paths from health literacy via information seeking and empowerment to GP consultations can be interpreted as a dynamic process and an expression of the ability to find, process, and understand relevant information when that is necessary. ©Peter Johannes Schulz, Mary Anne Fitzpatrick, Alexandra Hess, Lynn Sudbury-Riley, Uwe Hartung.
Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 16.05.2017.
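The mediation logic behind such a path analysis can be illustrated with ordinary least squares on simulated data. This is a generic sketch of estimating an indirect effect (path a times path b) alongside the direct effect, not the study's actual model or data; all coefficients below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 996  # sample size matching the study; the data themselves are simulated

# Simulated variables: X = eHealth literacy, M = information seeking, Y = GP visits
X = rng.normal(0, 1, n)
M = 0.5 * X + rng.normal(0, 1, n)              # path a: X -> M
Y = 0.4 * M + 0.0 * X + rng.normal(0, 1, n)    # path b: M -> Y; direct effect ~0

def ols(y, *preds):
    """Least-squares slopes of y on the given predictors (intercept included)."""
    A = np.column_stack([np.ones_like(y), *preds])
    return np.linalg.lstsq(A, y, rcond=None)[0][1:]

a = ols(M, X)[0]             # effect of X on the mediator
b, c_prime = ols(Y, M, X)    # effect of M on Y, and direct effect of X on Y
print(f"indirect effect a*b = {a * b:.2f}, direct effect = {c_prime:.2f}")
```

When the indirect product is substantial and the direct effect is near zero, the mediator fully carries the relationship, which is the pattern the study reports for information seeking and empowerment.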
Monitoring the quality of welding based on welding current and STE analysis
NASA Astrophysics Data System (ADS)
Mazlan, Afidatusshimah; Daniyal, Hamdan; Izzani Mohamed, Amir; Ishak, Mahadzir; Hadi, Amran Abdul
2017-10-01
Weld quality plays an important part in industry, especially in manufacturing. Post-weld non-destructive testing is one of the key processes for ensuring weld quality, but it is time-consuming and costly. To reduce the chance of defects, online monitoring has been utilized to continuously sense welding parameters and predict weld quality. One of these parameters is the welding current, which is rich in information, yet few studies have focused on extracting that information at the signal-analysis level. This paper presents an analysis of welding current using Short Time Energy (STE) signal processing to quantify the pattern of the current. A GMAW set with carbon steel specimens was used in this experimental study, with a high-bandwidth, high-sampling-rate oscilloscope capturing the welding current. The results indicate that welding current signatures correlate strongly with the welding process. In the STE analysis, values below 5000 indicate good welds, whereas STE values above 6000 indicate a defect.
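As a rough illustration of the technique described, Short Time Energy is simply the sum of squared samples within each sliding window, so a transient disturbance in the current appears as a localized energy spike. The synthetic signal, window length, and amplitudes below are illustrative assumptions, not the paper's settings or thresholds.

```python
import numpy as np

def short_time_energy(signal, frame_len=256, hop=128):
    """Short Time Energy (STE): sum of squared samples in each sliding window."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, hop)]
    return np.array([np.sum(f ** 2) for f in frames])

# Synthetic "welding current": a steady sinusoid plus a transient burst
t = np.linspace(0, 1, 4096)
current = np.sin(2 * np.pi * 50 * t)
current[2000:2200] += 3.0  # simulated disturbance standing in for a defect
ste = short_time_energy(current)
print(ste.max() > 10 * np.median(ste))  # the burst stands out clearly in STE
```

Thresholding the STE trace, as the paper does with its 5000/6000 boundaries, then reduces quality monitoring to comparing each window's energy against calibrated limits.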
Biometric Attendance and Big Data Analysis for Optimizing Work Processes.
Verma, Neetu; Xavier, Teenu; Agrawal, Deepak
2016-01-01
Although biometric attendance management is available, large healthcare organizations have difficulty with big data analysis for the optimization of work processes. The aim of this project was to assess the implementation of a biometric attendance system and its utility following big data analysis. In this prospective study, the implementation of the biometric system was evaluated over a 3-month period at our institution. Software integration with other existing systems for data analysis was also evaluated. Implementation of the biometric system was successfully completed over a two-month period, with enrollment of 10,000 employees into the system. However, generating reports and taking action for this large number of staff was a challenge. For this purpose, software was developed to capture the duty roster of each employee, integrate it with the biometric system, and add an SMS gateway. This helped automate the process of sending SMSs to each employee who had not signed in. Standalone biometric systems have limited functionality in large organizations unless they are meshed with the employee duty roster.
Watts, Adreanna T M; Tootell, Anne V; Fix, Spencer T; Aviyente, Selin; Bernat, Edward M
2018-04-29
The neurophysiological mechanisms involved in the evaluation of performance feedback have been widely studied in the ERP literature over the past twenty years, but understanding has been limited by the use of traditional time-domain amplitude analytic approaches. Gambling outcome valence has been identified as an important factor modulating event-related potential (ERP) components, most notably the feedback negativity (FN). Recent work employing time-frequency analysis has shown that processes indexed by the FN are confounded in the time-domain and can be better represented as separable feedback-related processes in the theta (3-7 Hz) and delta (0-3 Hz) frequency bands. In addition to time-frequency amplitude analysis, phase synchrony measures have begun to further our understanding of performance evaluation by revealing how feedback information is processed within and between various brain regions. The current study aimed to provide an integrative assessment of time-frequency amplitude, inter-trial phase synchrony, and inter-channel phase synchrony changes following monetary feedback in a gambling task. Results revealed that time-frequency amplitude activity explained separable loss and gain processes confounded in the time-domain. Furthermore, phase synchrony measures explained unique variance above and beyond amplitude measures and demonstrated enhanced functional integration between medial prefrontal and bilateral frontal, motor, and occipital regions for loss relative to gain feedback. These findings demonstrate the utility of assessing time-frequency amplitude, inter-trial phase synchrony, and inter-channel phase synchrony together to better elucidate the neurophysiology of feedback processing. Copyright © 2017. Published by Elsevier B.V.
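The inter-trial phase synchrony measure central to this work can be sketched with a few lines of NumPy: for each frequency bin, take the phase of every trial's spectrum, map it to a unit vector, and measure the magnitude of the across-trial mean. The simulated EEG trials below are illustrative, not the study's data; the sampling rate and frequencies are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n_trials, n_samp = 256, 100, 256
t = np.arange(n_samp) / fs

# Simulated trials: a phase-locked 5 Hz (theta-band) component plus noise
trials = np.sin(2 * np.pi * 5 * t) + rng.normal(0, 1, (n_trials, n_samp))

phases = np.angle(np.fft.rfft(trials, axis=1))
freqs = np.fft.rfftfreq(n_samp, 1 / fs)

# Inter-trial phase coherence: magnitude of the mean unit phase vector
itpc = np.abs(np.mean(np.exp(1j * phases), axis=0))

print(f"ITPC at 5 Hz:  {itpc[freqs == 5][0]:.2f}")   # high: phase-locked
print(f"ITPC at 50 Hz: {itpc[freqs == 50][0]:.2f}")  # low: noise only
```

Inter-channel phase synchrony follows the same idea, except the phase differences are taken between two electrode signals rather than across trials.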
Climate Change Mitigation Challenge for Wood Utilization-The Case of Finland.
Soimakallio, Sampo; Saikku, Laura; Valsta, Lauri; Pingoud, Kim
2016-05-17
The urgent need to mitigate climate change invokes both opportunities and challenges for forest biomass utilization. Fossil fuels can be substituted by using wood products in place of alternative materials and energy, but wood harvesting reduces the forest carbon sink, and processing of wood products requires material and energy inputs. We assessed the extended life-cycle carbon emissions, considering substitution impacts, for various wood utilization scenarios over 100 years from 2010 onward for Finland. The scenarios were based on varied but constant wood utilization structures reflecting the current and anticipated mix of wood utilization activities. We applied stochastic simulation to deal with the uncertainty in a number of required input variables. According to our analysis, wood utilization decreases net carbon emissions with a probability lower than 40% in each of the studied scenarios. Furthermore, large emission reductions were exceptionally unlikely. The uncertainty of the results was influenced most strongly by the reduction in the forest carbon sink. There is a significant trade-off between avoiding emissions through fossil fuel substitution and reducing the forest carbon sink through wood harvesting. This creates a major challenge for forest management practices and wood utilization activities in responding to ambitious climate change mitigation targets.
Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler
2016-01-01
An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented as it pertains to understanding what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.
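A Monte Carlo uncertainty analysis of this kind propagates measurement errors by sampling the inputs and evaluating the data-reduction equation many times. The sketch below applies the idea to the standard isentropic Mach number relation; the pressure values and error magnitudes are invented for illustration and are not the facility's actual numbers.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000   # Monte Carlo samples
gamma = 1.4   # ratio of specific heats for air

# Hypothetical pressure measurements with normally distributed errors
p0 = rng.normal(101_325.0, 200.0, N)  # total pressure [Pa], +/-200 Pa
p = rng.normal(18_000.0, 100.0, N)    # static pressure [Pa], +/-100 Pa

# Isentropic relation: M = sqrt( 2/(gamma-1) * ((p0/p)^((gamma-1)/gamma) - 1) )
mach = np.sqrt(2.0 / (gamma - 1.0) * ((p0 / p) ** ((gamma - 1.0) / gamma) - 1.0))

print(f"Mach = {mach.mean():.4f} +/- {2 * mach.std():.4f} (approx. 95% coverage)")
```

The spread of the resulting Mach distribution is the propagated measurement uncertainty; correlated inputs would be handled by sampling them jointly rather than independently.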
Automating security monitoring and analysis for Space Station Freedom's electric power system
NASA Technical Reports Server (NTRS)
Dolce, James L.; Sobajic, Dejan J.; Pao, Yoh-Han
1990-01-01
Operating a large, space power system requires classifying the system's status and analyzing its security. Conventional algorithms are used by terrestrial electric utilities to provide such information to their dispatchers, but their application aboard Space Station Freedom will consume too much processing time. A new approach for monitoring and analysis using adaptive pattern techniques is presented. This approach yields an on-line security monitoring and analysis algorithm that is accurate and fast; and thus, it can free the Space Station Freedom's power control computers for other tasks.
Aletor, V A; Olonimoyo, F I
1992-01-01
The comparative utilization of differently processed (roasted, cooked, and oil cake) soya bean-based diets and a groundnut cake diet was evaluated in a feeding trial using 100 day-old Anak broiler chicks. The response criteria included performance, protein utilization, relative organ weights, carcass traits, and economy of production. At the end of the feeding trial, the average weight gains of chicks fed processed soya bean diets were significantly (P < 0.05) higher than those fed the groundnut cake and raw soya bean diets. Both feed consumption and efficiency were significantly (P < 0.05) enhanced by processing. For example, feed consumption was highest in the chicks fed soya bean oil cake and lowest in those fed raw bean. Feed efficiency was best in chicks fed roasted soya bean. The relative weights [g/100 g body wt.] of the liver, kidneys, lungs, heart, gizzard, and bursa were not significantly affected by the differently processed soya bean, while the raw (unprocessed) bean significantly (P < 0.01) increased pancreas weight. The dressed weight [%], eviscerated weight [%], and the relative weights of the thigh, drumsticks, chest, back, and head were not significantly influenced by the dietary treatments. However, the relative weights of the shank and belly fat were significantly (P < 0.05) affected. Cost-benefit analysis showed that the processed soya bean diets gave higher profit than the groundnut cake diet. Among the soya bean diets, profit was in the order: roasted > cooked > oil cake > raw bean.
An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis
NASA Astrophysics Data System (ADS)
Kim, Yongmin; Alexander, Thomas
1986-06-01
In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.
The process and utility of classification and regression tree methodology in nursing research
Kuhn, Lisa; Page, Karen; Ward, John; Worrall-Carter, Linda
2014-01-01
Aim: This paper presents a discussion of classification and regression tree analysis and its utility in nursing research. Background: Classification and regression tree analysis is an exploratory research method used to illustrate associations between variables not suited to traditional regression analysis. Complex interactions are demonstrated between covariates and variables of interest in inverted tree diagrams. Design: Discussion paper. Data sources: English language literature was sourced from eBooks, Medline Complete and CINAHL Plus databases, Google and Google Scholar, hard copy research texts and retrieved reference lists for terms including classification and regression tree* and derivatives and recursive partitioning from 1984–2013. Discussion: Classification and regression tree analysis is an important method used to identify previously unknown patterns amongst data. Whilst there are several reasons to embrace this method as a means of exploratory quantitative research, issues regarding quality of data as well as the usefulness and validity of the findings should be considered. Implications for nursing research: Classification and regression tree analysis is a valuable tool to guide nurses to reduce gaps in the application of evidence to practice. With the ever-expanding availability of data, it is important that nurses understand the utility and limitations of the research method. Conclusion: Classification and regression tree analysis is an easily interpreted method for modelling interactions between health-related variables that would otherwise remain obscured. Knowledge is presented graphically, providing insightful understanding of complex and hierarchical relationships in an accessible and useful way to nursing and other health professions. PMID:24237048
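As a concrete illustration of the method (not drawn from the paper itself), a classification tree can be fitted and its inverted-tree structure printed with scikit-learn's `DecisionTreeClassifier`, here on a bundled clinical dataset:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier, export_text

# Fit a small classification tree on a public clinical dataset
X, y = load_breast_cancer(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The inverted-tree structure of splits, rendered as text
print(export_text(tree, feature_names=list(load_breast_cancer().feature_names)))
print(f"Training accuracy: {tree.score(X, y):.2f}")
```

Limiting the depth, as here, keeps the tree interpretable, which is exactly the property the paper highlights for communicating complex interactions to clinicians.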
Partial differential equation transform — Variational formulation and Fourier analysis
Wang, Yang; Wei, Guo-Wei; Yang, Siyang
2011-01-01
Nonlinear partial differential equation (PDE) models are established approaches for image/signal processing, data analysis and surface construction. Most previous geometric PDEs are utilized as low-pass filters which give rise to image trend information. In an earlier work, we introduced mode decomposition evolution equations (MoDEEs), which behave like high-pass filters and are able to systematically provide intrinsic mode functions (IMFs) of signals and images. Due to their tunable time-frequency localization and perfect reconstruction, the operation of MoDEEs is called a PDE transform. By appropriate selection of PDE transform parameters, we can tune IMFs into trends, edges, textures, noise etc., which can be further utilized in the secondary processing for various purposes. This work introduces the variational formulation, performs the Fourier analysis, and conducts biomedical and biological applications of the proposed PDE transform. The variational formulation offers an algorithm to incorporate two image functions and two sets of low-pass PDE operators in the total energy functional. Two low-pass PDE operators have different signs, leading to energy disparity, while a coupling term, acting as a relative fidelity of two image functions, is introduced to reduce the disparity of two energy components. We construct variational PDE transforms by using Euler-Lagrange equation and artificial time propagation. Fourier analysis of a simplified PDE transform is presented to shed light on the filter properties of high order PDE transforms. Such an analysis also offers insight on the parameter selection of the PDE transform. The proposed PDE transform algorithm is validated by numerous benchmark tests. In one selected challenging example, we illustrate the ability of PDE transform to separate two adjacent frequencies of sin(x) and sin(1.1x). Such an ability is due to PDE transform’s controllable frequency localization obtained by adjusting the order of PDEs. 
The frequency selection is achieved either by diffusion coefficients or by propagation time. Finally, we explore a large number of practical applications to further demonstrate the utility of proposed PDE transform. PMID:22207904
Concurrent Image Processing Executive (CIPE). Volume 1: Design overview
NASA Technical Reports Server (NTRS)
Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.
1990-01-01
The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3, Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.
Gu, Di; Gao, Simeng; Jiang, TingTing; Wang, Baohui
2017-03-15
To address three research hot spots - efficient solar utilization, green and sustainable remediation of wastewater, and advanced oxidation processes - solar-mediated thermo-electrochemical oxidation of surfactant was proposed and developed for the green remediation of surfactant wastewater. The solar thermal electrochemical process (STEP), fully driven by solar energy converted to electric energy and heat, without input of any other energy, sustainably serves as an efficient thermo-electrochemical oxidation of surfactant, exemplified by SDBS, in wastewater, with the synergistic production of hydrogen. The electrooxidation-resistant surfactant is thermo-electrochemically oxidized to CO2 while hydrogen gas is generated, by lowering the effective oxidation potential and suppressing the oxidation activation energy through the combination of thermochemical and electrochemical effects. A clear conclusion on the mechanism of SDBS degradation is proposed and discussed based on theoretical analysis of the electrochemical potential by a quantum chemical method and experimental analysis of the CV, TG, GC, FT-IR, UV-vis, and fluorescence spectra and TOC. The degradation data provide a pilot for the treatment of SDBS wastewater, which appears to proceed via desulfonation followed by aromatic-ring opening. The solar thermal utilization that initiates the desulfonation and activation of SDBS is a key step in the degradation process.
Seismic joint analysis for non-destructive testing of asphalt and concrete slabs
Ryden, N.; Park, C.B.
2005-01-01
A seismic approach is used to estimate the thickness and elastic stiffness constants of asphalt or concrete slabs. The overall concept of the approach utilizes the robustness of the multichannel seismic method. A multichannel-equivalent data set is compiled from multiple time series recorded from multiple hammer impacts at progressively different offsets from a fixed receiver. This multichannel simulation with one receiver (MSOR) replaces true multichannel recording in a cost-effective and convenient manner. A recorded data set is first processed to evaluate the shear wave velocity through a wave field transformation, normally used in the multichannel analysis of surface waves (MASW) method, followed by a Lamb-wave inversion. Then, the same data set is used to evaluate compression wave velocity from a combined processing of first-arrival picking and a linear regression. Finally, the amplitude spectra of the time series are used to evaluate the thickness by following the concepts utilized in the Impact Echo (IE) method. Due to the powerful signal extraction capabilities ensured by the multichannel processing schemes used, the entire procedure for all three evaluations can be fully automated and results can be obtained directly in the field. A field data set is used to demonstrate the proposed approach.
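The Impact Echo thickness evaluation in the final step rests on a simple resonance relation: the slab thickness T corresponds to the dominant spectral peak f through T ≈ βVp/(2f), where Vp is the compression wave velocity and β is a plate correction factor, commonly taken near 0.96. A minimal sketch with illustrative values (not from the field data set):

```python
def impact_echo_thickness(vp_m_s, peak_freq_hz, beta=0.96):
    """Impact Echo thickness estimate: T = beta * Vp / (2 * f),
    where Vp is the P-wave velocity, f is the dominant resonance
    frequency of the amplitude spectrum, and beta is a plate
    correction factor (~0.96 for plate-like geometries)."""
    return beta * vp_m_s / (2.0 * peak_freq_hz)

# Example: concrete slab with Vp = 4000 m/s and a spectral peak at 9.6 kHz
print(f"{impact_echo_thickness(4000.0, 9600.0):.3f} m")  # -> 0.200 m
```

In the automated procedure described, Vp comes from the first-arrival regression and f from the peak of the averaged amplitude spectra, so the thickness follows with no manual picking.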
NASA Astrophysics Data System (ADS)
Gu, Di; Gao, Simeng; Jiang, Tingting; Wang, Baohui
2017-03-01
To address three active research areas, efficient solar utilization, green and sustainable remediation of wastewater, and advanced oxidation processes, solar-mediated thermo-electrochemical oxidation of surfactants was proposed and developed for the green remediation of surfactant wastewater. The solar thermal electrochemical process (STEP), fully driven by solar energy converted to electricity and heat, with no input of other energy, sustainably provides efficient thermo-electrochemical oxidation of a surfactant, exemplified by SDBS, in wastewater with the synergistic production of hydrogen. The electrooxidation-resistant surfactant is thermo-electrochemically oxidized to CO2 while hydrogen gas is generated, by lowering the effective oxidation potential and suppressing the oxidation activation energy, effects originating from the combination of thermochemical and electrochemical action. A clear mechanism for SDBS degradation is proposed and discussed based on theoretical analysis of the electrochemical potential by quantum chemical methods and experimental analysis by CV, TG, GC, FT-IR, UV-vis and fluorescence spectra and TOC. The degradation data provide a pilot for the treatment of SDBS wastewater, which appears to proceed via desulfonation followed by aromatic-ring opening. Solar thermal utilization, which initiates the desulfonation and activation of SDBS, is a key step in the degradation process. PMID:28294180
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Yunhua; Jones, Susanne B.; Biddy, Mary J.
2012-08-01
This study reports a comparison of biomass gasification based syngas-to-distillate (S2D) systems using techno-economic analysis (TEA). Three cases, a state of technology (SOT) case, a goal case, and a conventional case, were compared in terms of performance and cost. The SOT case and goal case represent technology being developed at Pacific Northwest National Laboratory for a process starting with syngas and using a single-step dual-catalyst reactor for distillate generation (S2D process). The conventional case mirrors the two-step S2D process previously utilized and reported by Mobil, using natural gas feedstock and consisting of separate syngas-to-methanol and methanol-to-gasoline (MTG) processes. Analysis of the three cases revealed that the goal case could indeed reduce fuel production cost below the conventional case, but that the SOT case was still more expensive than the conventional one. The SOT case suffers from low one-pass yield and high selectivity to light hydrocarbons, both of which drive up production cost. Sensitivity analysis indicated that light hydrocarbon yield, single-pass conversion efficiency, and reactor space velocity are the key factors driving the high cost of the SOT case.
Kryklywy, James H; Macpherson, Ewan A; Mitchell, Derek G V
2018-04-01
Emotion can have diverse effects on behaviour and perception, modulating function in some circumstances, and sometimes having little effect. Recently, it was identified that part of the heterogeneity of emotional effects could be due to a dissociable representation of emotion in dual-pathway models of sensory processing. Our previous fMRI experiment using traditional univariate analyses showed that emotion modulated processing in the auditory 'what' but not 'where' processing pathway. The current study aims to further investigate this dissociation using the more recently emerging multi-voxel pattern analysis (MVPA) searchlight approach. While undergoing fMRI, participants localized sounds of varying emotional content. A searchlight multi-voxel pattern analysis was conducted to identify activity patterns predictive of sound location and/or emotion. Relative to the prior univariate analysis, MVPA indicated larger overlapping spatial and emotional representations of sound within early secondary regions associated with auditory localization. However, consistent with the univariate analysis, these two dimensions were increasingly segregated in late secondary and tertiary regions of the auditory processing streams. These results, while complementary to our original univariate analyses, highlight the utility of multiple analytic approaches for neuroimaging, particularly for neural processes whose representations are known to depend on population coding.
Performance analysis of CCSDS path service
NASA Technical Reports Server (NTRS)
Johnson, Marjory J.
1989-01-01
A communications service, called the Path Service, is currently being developed by the Consultative Committee for Space Data Systems (CCSDS) to provide a mechanism for the efficient transmission of telemetry data from space to ground for complex space missions of the future. This is an important service, due to the large volumes of telemetry data that will be generated during these missions. A preliminary analysis of the performance of the Path Service is presented with respect to protocol-processing requirements and channel utilization.
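Channel utilization in this sense reduces to the fraction of transmitted bits that are telemetry payload rather than protocol overhead. A minimal illustrative calculation (the frame sizes are assumed, not taken from the CCSDS report):

```python
# Hypothetical example: effective channel utilization of a telemetry link
# when each frame carries fixed header/trailer overhead.

def channel_utilization(payload_bytes, header_bytes, trailer_bytes):
    """Fraction of link capacity carrying payload data."""
    frame = payload_bytes + header_bytes + trailer_bytes
    return payload_bytes / frame

u = channel_utilization(payload_bytes=1024, header_bytes=6, trailer_bytes=2)
print(f"{u:.4f}")  # fraction of the channel carrying telemetry data
```

Larger frames amortize the fixed overhead better, which is one reason protocol analyses of this kind examine frame-size trade-offs.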
System and method for high precision isotope ratio destructive analysis
Bushaw, Bruce A; Anheier, Norman C; Phillips, Jon R
2013-07-02
A system and process are disclosed that provide high accuracy and high precision destructive analysis measurements for isotope ratio determination of relative isotope abundance distributions in liquids, solids, and particulate samples. The invention utilizes a collinear probe beam to interrogate a laser ablated plume. This invention provides enhanced single-shot detection sensitivity approaching the femtogram range, and isotope ratios that can be determined at approximately 1% or better precision and accuracy (relative standard deviation).
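The quoted precision figure is a relative standard deviation over repeated measurements. A small sketch of that computation, with made-up replicate ratio values (not data from the patent):

```python
# Illustrative RSD calculation for repeated isotope-ratio measurements.
# The replicate values below are invented for the example.
from statistics import mean, stdev

def relative_std_dev(ratios):
    """Sample standard deviation divided by the mean, as a percentage."""
    return 100.0 * stdev(ratios) / mean(ratios)

replicates = [0.0724, 0.0729, 0.0726, 0.0731, 0.0725]  # hypothetical shots
print(f"RSD = {relative_std_dev(replicates):.2f}%")
```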
Ore minerals textural characterization by hyperspectral imaging
NASA Astrophysics Data System (ADS)
Bonifazi, Giuseppe; Picone, Nicoletta; Serranti, Silvia
2013-02-01
The utilization of hyperspectral detection devices for natural resources mapping and exploitation through remote sensing dates back to the early 1970s. From the first devices, one-dimensional profile spectrometers, HyperSpectral Imaging (HSI) devices have been developed. Thus, from specific customized devices originally developed by governmental agencies (e.g. NASA) and specialized research labs, a great deal of HSI-based equipment is today commercially available. Parallel to this rapid growth in the development and manufacture of hyperspectral systems for airborne application, strong growth has also occurred in HSI-based devices for "ground" utilization, that is, sensing units able to operate inside a laboratory, in a processing plant and/or in the open field. Thanks to this diffusion, more and more applications have been developed and tested in recent years in the materials sectors as well. Such an approach, when successful, is quite attractive, being usually reliable, robust and characterized by lower costs than those of commonly applied off-line and/or on-line analytical approaches. In this paper such an approach is presented with reference to ore minerals characterization. According to the different phases and stages of ore minerals and products characterization, and starting from analysis of the detected hyperspectral signatures, useful information can be derived about mineral flow stream properties and their physical-chemical attributes. This information can be utilized to define innovative process mineralogy strategies and to implement on-line procedures at the processing level. The present study discusses the effects of adopting different hardware configurations, different logics for performing the analysis and different algorithms according to the characterization, inspection and quality control actions to be applied.
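One classic primitive consistent with this kind of HSI mineral characterization (though not stated as the paper's own algorithm) is the spectral angle mapper: the angle between a pixel spectrum and a reference spectrum, insensitive to illumination scaling. The spectra below are invented for illustration.

```python
# Hedged sketch of the spectral angle mapper (SAM); the band values are
# illustrative, not measured ore signatures.
import math

def spectral_angle(pixel, reference):
    """Angle in radians between two spectra; smaller = better match."""
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_r = math.sqrt(sum(r * r for r in reference))
    cos_t = max(-1.0, min(1.0, dot / (norm_p * norm_r)))  # clamp fp error
    return math.acos(cos_t)

ref = [0.12, 0.35, 0.50, 0.41]          # hypothetical mineral reference
same_shape = [0.24, 0.70, 1.00, 0.82]   # scaled copy -> angle ~ 0
other = [0.50, 0.10, 0.05, 0.40]        # different spectral shape
print(spectral_angle(same_shape, ref) < spectral_angle(other, ref))  # True
```

Because SAM compares only spectral shape, a brighter pixel of the same mineral still matches its reference, which suits on-line inspection under varying illumination.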
Ethical implications of digital images for teaching and learning purposes: an integrative review.
Kornhaber, Rachel; Betihavas, Vasiliki; Baber, Rodney J
2015-01-01
Digital photography has simplified the process of capturing and utilizing medical images. The process of taking high-quality digital photographs has been recognized as efficient, timely, and cost-effective. In particular, the evolution of smartphone and comparable technologies has become a vital component in the teaching and learning of health care professionals. However, ethical standards in relation to digital photography for teaching and learning have not always been of the highest standard. The inappropriate utilization of digital images within the health care setting has the capacity to compromise patient confidentiality and increase the risk of litigation. Therefore, the aim of this review was to investigate the literature concerning the ethical implications for health professionals utilizing digital photography for teaching and learning. A literature search was conducted utilizing five electronic databases, PubMed, Embase (Excerpta Medica Database), Cumulative Index to Nursing and Allied Health Literature, Educational Resources Information Center, and Scopus, limited to the English language. Studies that endeavored to evaluate the ethical implications of digital photography for teaching and learning purposes in the health care setting were included. The search strategy identified 514 papers, of which nine were retrieved for full review. Four papers were excluded based on the inclusion criteria, leaving five papers for final analysis. Three key themes were developed: knowledge deficit, consent and beyond, and standards driving scope of practice. The assimilation of evidence in this review suggests that there is value for health professionals utilizing digital photography for teaching purposes in health education. However, there is limited understanding of the process of obtaining, storing and using such media for teaching purposes. Disparity was also highlighted in relation to policy and guideline identification and development in clinical practice.
Therefore, the implementation of policy to guide practice requires further research.
A decision modeling for phasor measurement unit location selection in smart grid systems
NASA Astrophysics Data System (ADS)
Lee, Seung Yup
As a key technology for enhancing the smart grid system, the Phasor Measurement Unit (PMU) provides synchronized phasor measurements of voltages and currents across a wide-area electric power grid. Despite the varied benefits of its application, one of the critical issues in utilizing PMUs is the optimal selection of unit sites. The main aim of this research is to develop a decision support system that can be used in resource allocation tasks for smart grid system analysis. As an effort to suggest a robust decision model and standardize the decision modeling process, a harmonized modeling framework, which considers the operational circumstances of components, is proposed in connection with a deterministic approach utilizing integer programming. With the results obtained from the optimal PMU placement problem, the advantages and potential of the harmonized modeling process are assessed and discussed.
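The optimal PMU placement problem referred to above is commonly formalized as: a PMU at a bus observes that bus and its neighbours, and the goal is the smallest placement observing every bus. A brute-force sketch on a toy network (the topology is made up; the paper's deterministic approach uses integer programming rather than enumeration):

```python
# Hedged sketch of optimal PMU placement (OPP) by brute force on a small,
# hypothetical bus graph; real OPP formulations use integer programming.
from itertools import combinations

def min_pmu_placement(adjacency):
    """Smallest set of buses whose PMUs observe every bus in the graph."""
    buses = sorted(adjacency)
    for k in range(1, len(buses) + 1):
        for subset in combinations(buses, k):
            observed = set()
            for b in subset:
                observed.add(b)          # PMU observes its own bus...
                observed.update(adjacency[b])  # ...and adjacent buses
            if observed >= set(buses):
                return set(subset)

# Toy 5-bus network (invented topology)
grid = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2}, 4: {2, 5}, 5: {4}}
placement = min_pmu_placement(grid)
print(placement)  # a minimum placement of two PMUs
```

Enumeration is exponential in the number of buses, which is why realistic grids require the integer-programming formulation instead.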
Dereshgi, Sina Abedini; Okyay, Ali Kemal
2016-08-08
Plasmonically enhanced absorbing structures have been emerging as strong candidates for photovoltaic (PV) devices. We investigate metal-insulator-metal (MIM) structures that are suitable for tuning spectral absorption properties by modifying layer thicknesses. We have utilized gold and silver nanoparticles to form the top metal (M) region, obtained by a dewetting process compatible with large-area processing. For the middle (I) and bottom (M) layers, different dielectric materials and metals are investigated. Optimum MIM designs are discussed. We experimentally demonstrate less than 10 percent reflection for most of the visible (VIS) and near-infrared (NIR) spectrum. In such stacks, computational analysis shows that the bottom metal is responsible for a large portion of the absorption, with a peak of 80 percent at 1000 nm wavelength for the chromium case.
ECG R-R peak detection on mobile phones.
Sufi, F; Fang, Q; Cosic, I
2007-01-01
Mobile phones have become an integral part of modern life. Due to ever increasing processing power, mobile phones are rapidly expanding their arena from a sole device of telecommunication to organizer, calculator, gaming device, web browser, music player, audio/video recording device, navigator, etc. The processing power of modern mobile phones has been utilized for many innovative purposes. In this paper, we propose the utilization of mobile phones for the monitoring and analysis of biosignals. The computation performed inside the mobile phone's processor will now be exploited for healthcare delivery. We performed a literature review on R-R interval detection from ECG and selected a few PC-based algorithms. Three of those existing R-R interval detection algorithms were then programmed on the Java platform. Performance monitoring and comparison studies were carried out on three different mobile devices to determine their applicability in a real-time telemonitoring scenario.
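The core of any R-R detector is locating R peaks and differencing their sample times. A deliberately minimal sketch (threshold plus local maximum, on a synthetic signal); production detectors such as Pan-Tompkins add band-pass filtering and adaptive thresholds, and the paper's algorithms ran on Java rather than Python:

```python
# Minimal, hypothetical R-peak detector: local maxima above a fixed
# threshold. Signal, threshold, and sampling rate are all synthetic.

def detect_r_peaks(ecg, threshold):
    """Indices of samples that exceed the threshold and are local maxima."""
    peaks = []
    for i in range(1, len(ecg) - 1):
        if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            peaks.append(i)
    return peaks

fs = 100                          # assumed sampling rate, Hz
ecg = [0.1] * 300                 # flat baseline
for beat in (50, 130, 210):       # synthetic R waves 80 samples apart
    ecg[beat] = 1.0
peaks = detect_r_peaks(ecg, threshold=0.5)
rr_s = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
print(peaks, rr_s)  # RR intervals of 0.8 s, i.e. 75 beats per minute
```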
The present situations and perspectives on utilization of research reactors in Thailand
NASA Astrophysics Data System (ADS)
Chongkum, Somporn
2002-01-01
The Thai Research Reactor 1/Modification 1, a TRIGA Mark III reactor, went critical on November 7, 1977. It has been playing a central role in the development of both the Office of Atomic Energy for Peace (OAEP) and nuclear applications in Thailand. It has a maximum power of 2 MW (thermal) at steady state and a pulsing capacity of 2000 MW. The highest thermal neutron flux, at the central thimble, is 1×10^13 n/cm^2/s, which is extensively utilized for radioisotope production, neutron activation analysis and neutron beam experiments, i.e. neutron scattering, prompt gamma analysis and neutron radiography. Following this nuclear technological development, the OAEP is in the process of establishing the Ongkharak Nuclear Research Center (ONRC). The center is being built in Nakhon Nayok province, 60 km northeast of Bangkok. The centerpiece of the ONRC is a multipurpose 10 MW TRIGA research reactor. Facilities are included for the production of radioisotopes for medicine, industry and agriculture, neutron transmutation doping of silicon, and neutron capture therapy. The neutron beam facilities will also be utilized for applied research and technology development as well as training in reactor operations, performance of experiments and reactor physics. This paper describes a recent program of utilization as well as a new research reactor for enlarging the perspectives of its utilization in the future.
Software system for data management and distributed processing of multichannel biomedical signals.
Franaszczuk, P J; Jouny, C C
2004-01-01
The presented software is designed for efficient utilization of a cluster of PC computers for signal analysis of multichannel physiological data. The system consists of three main components: 1) a library of input and output procedures, 2) a database storing additional information about location in a storage system, and 3) a user interface for selecting data for analysis, choosing programs for analysis, and distributing computing and output data across cluster nodes. The system allows processing of multichannel time series data in multiple binary formats. Descriptions of the data format, channels and time of recording are included in separate text files. Definition and selection of multiple channel montages are possible. Epochs for analysis can be selected both manually and automatically. Implementation of new signal processing procedures is possible with minimal programming overhead for the input/output processing and user interface. The number of cluster nodes used for computations and the amount of storage can be changed with no major modification to the software. Current implementations include time-frequency analysis of multiday, multichannel recordings of intracranial EEG of epileptic patients as well as evoked-response analyses of repeated cognitive tasks.
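The node-distribution property described (changing the cluster size without code changes) can be sketched with a simple round-robin assignment of analysis epochs; the epoch labels and node count here are illustrative, not part of the original system:

```python
# Hypothetical sketch of distributing analysis epochs round-robin across
# cluster nodes; changing n_nodes needs no other code change.

def distribute_epochs(epochs, n_nodes):
    """Map node index -> list of epochs assigned to that node."""
    assignment = {node: [] for node in range(n_nodes)}
    for i, epoch in enumerate(epochs):
        assignment[i % n_nodes].append(epoch)
    return assignment

epochs = [f"epoch_{i:03d}" for i in range(10)]   # invented epoch labels
plan = distribute_epochs(epochs, n_nodes=3)
for node, work in plan.items():
    print(node, work)
```

Round-robin balances load to within one epoch per node, which is adequate when epochs have similar processing cost; unequal costs would call for a work queue instead.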
NASA Astrophysics Data System (ADS)
Ikeno, Rimon; Mita, Yoshio; Asada, Kunihiro
2017-04-01
High-throughput electron-beam lithography (EBL) by character projection (CP) and variable-shaped beam (VSB) methods is a promising technique for low-to-medium volume device fabrication with regularly arranged layouts, such as standard-cell logics and memory arrays. However, non-VLSI applications like MEMS and MOEMS may not fully utilize the benefits of the CP method due to their wide variety of layout figures including curved and oblique edges. In addition, the stepwise shapes that appear on such irregular edges under VSB exposure often result in intolerable edge roughness, which may degrade the performance of the fabricated devices. In our former study, we proposed a general EBL methodology for such applications utilizing a combination of the CP and VSB methods, and demonstrated its capabilities in electron beam (EB) shot reduction and edge-quality improvement using a leading-edge EB exposure tool, the ADVANTEST F7000S-VD02, and high-resolution hydrogen silsesquioxane resist. Both scanning electron microscope and atomic force microscope observations were used to analyze the quality of the resist edge profiles and to determine the influence of the control parameters used in the exposure-data preparation process. In this study, we carried out a detailed analysis of the captured edge profiles utilizing Fourier analysis, and successfully distinguished the systematic undulation caused by the exposed CP character profiles from the random roughness components. Such precise edge-roughness analysis is useful to our EBL methodology for maintaining both line-edge quality and exposure throughput by optimizing the control parameters in the layout data conversion.
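The Fourier separation described works because a periodic undulation (tied to the CP character pitch) appears as a sharp spectral peak, while random roughness spreads across all spatial frequencies. A toy sketch with a synthetic edge profile (the pitch and noise level are assumed, not from the paper's data):

```python
# Hedged sketch: amplitude spectrum of a synthetic edge profile separates
# a periodic undulation (sharp peak) from broadband random roughness.
import cmath
import math
import random

def amplitude_spectrum(profile):
    """Single-sided DFT amplitude spectrum (naive O(n^2), fine for toys)."""
    n = len(profile)
    return [abs(sum(profile[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

random.seed(0)
n = 64
pitch = 8  # assumed undulation period in samples -> spectral bin n/pitch
edge = [math.sin(2 * math.pi * t / pitch) + 0.1 * random.gauss(0, 1)
        for t in range(n)]
spec = amplitude_spectrum(edge)
print(spec.index(max(spec[1:])))  # dominant non-DC bin: 64 / 8 = 8
```

Subtracting the energy in the peaked bins from the total then leaves an estimate of the purely random roughness component.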
NASA Astrophysics Data System (ADS)
Wei, J.; Wang, G.; Liu, R.
2008-12-01
The Tarim River is the longest inland river in China. Due to water scarcity, ecological fragility is becoming a significant constraint on sustainable development in this region. To effectively manage the limited water resources for ecological purposes and for conventional water utilization, a real-time water resources allocation Decision Support System (DSS) has been developed. Based on the workflows of water resources regulation and comprehensive analysis of the efficiency and feasibility of water management strategies, the DSS includes information systems that perform data acquisition, management and visualization, and model systems that perform hydrological forecasting, water demand prediction, flow routing simulation and water resources optimization of the hydrological and water utilization process. An optimization and process control strategy is employed to dynamically allocate the water resources among the different stakeholders. Competitive targets and constraints are taken into consideration through multi-objective optimization with different priorities. The DSS of the Tarim River Basin has been developed and successfully utilized to support the water resources management of the basin since 2005.
Liu, Lin; Gong, Weili; Sun, Xiaomeng; Chen, Guanjun; Wang, Lushan
2018-02-07
Byproducts of food processing can be utilized for the production of high-value-added enzyme cocktails. In this study, we utilized integrated functional omics technology to analyze composition and functional characteristics of extracellular enzymes produced by Aspergillus niger grown on food processing byproducts. The results showed that oligosaccharides constituted by arabinose, xylose, and glucose in wheat bran were able to efficiently induce the production of extracellular enzymes of A. niger. Compared with other substrates, wheat bran was more effective at inducing the secretion of β-glucosidases from GH1 and GH3 families, as well as >50% of proteases from A1-family aspartic proteases. Compared with proteins induced by single wheat bran or soybean dregs, the protein yield induced by their mixture was doubled, and the time required to reach peak enzyme activity was shortened by 25%. This study provided a technical platform for the complex formulation of various substrates and functional analysis of extracellular enzymes.
Sheldon, E M; Downar, J B
2000-08-15
Novel approaches to the development of analytical procedures for monitoring incoming starting material in support of chemical/pharmaceutical processes are described. High technology solutions were utilized for timely process development and preparation of high quality clinical supplies. A single robust HPLC method was developed and characterized for the analysis of the key starting material from three suppliers. Each supplier used a different process for the preparation of this material and, therefore, each suppliers' material exhibited a unique impurity profile. The HPLC method utilized standard techniques acceptable for release testing in a QC/manufacturing environment. An automated experimental design protocol was used to characterize the robustness of the HPLC method. The method was evaluated for linearity, limit of quantitation, solution stability, and precision of replicate injections. An LC-MS method that emulated the release HPLC method was developed and the identities of impurities were mapped between the two methods.
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support the design and implementation of coal-based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal-based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for the development of technology and data needed to improve gasification feasibility and economics are examined.
Study of the possibility of thermal utilization of contaminated water in low-power boilers
NASA Astrophysics Data System (ADS)
Roslyakov, P. V.; Proskurin, Y. V.; Zaichenko, M. N.
2017-09-01
The utilization of water contaminated with oil products is a topical problem for thermal power plants and boiler houses. It is reasonable to use special water treatment equipment only for large power engineering and industry facilities. Thermal utilization of contaminated water in boiler furnaces is proposed as an alternative version of its utilization. Since there are hot-water fire-tube boilers at many enterprises, it is necessary to study the possibility of thermal utilization of water contaminated with oil products in their furnaces. The object of this study is a KV-GM-2.0 boiler with a heating power of 2 MW. The pressurized burner developed at the Moscow Power Engineering Institute, National Research University, was used as a burner device for supplying liquid fuel. The computational investigations were performed on the basis of the computer simulation of processes of liquid fuel atomization, mixing, ignition, and burnout; in addition, the formation of nitrogen oxides was simulated on the basis of ANSYS Fluent computational dynamics software packages, taking into account radiative and convective heat transfer. Analysis of the results of numerical experiments on the combined supply of crude oil and water contaminated with oil products has shown that the thermal utilization of contaminated water in fire-tube boilers cannot be recommended. The main causes here are the impingement of oil droplets on the walls of the flame tube, as well as the delay in combustion and increased emissions of nitrogen oxides. The thermal utilization of contaminated water combined with diesel fuel can be arranged provided that the water consumption is not more than 3%; however, this increases the emission of nitrogen oxides. The further increase in contaminated water consumption will lead to the reduction of the reliability of the combustion process.
Handbook of evaluation of utility DSM programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirst, E.; Reed, J.; Bronfman, B.
Program evaluation has become a central issue in the world of utility integrated resource planning. The DSM programs that utilities were operating to meet federal requirements or to improve customer relations are now becoming big business. DSM is being considered an important resource in a utility's portfolio of options. In the last five years, the amount of money that utilities have invested in DSM has grown exponentially in most regulatory jurisdictions. Market analysts are now talking about DSM being a $30 billion industry by the end of the decade. If the large volume of DSM-program investments was not enough to highlight the importance of evaluation, then the introduction of regulatory incentives has really focused the spotlight. This handbook was developed through a process that involved many of those people who represent the diverse constituencies of DSM-program evaluation. We have come to recognize the many technical disciplines that must be employed to evaluate DSM programs. An analysis might start out based on the principles of utility load research to find out what happened, but a combination of engineering and statistical methods must be used to "triangulate" an estimate of what would have happened without the program. The difference, of course, is that elusive but prized result of evaluation: what happened as the direct result of the DSM program. Technical performance of DSM measures is not the sole determinant of the answer, either. We also recognize the importance of such behavioral attributes of DSM as persistence and free ridership. Finally, DSM evaluation is meaningless without attention to planning an approach, communicating results to relevant decision-makers, and focusing as much on the process as on the impacts of the program. These topics are all covered in this handbook.
2012-03-22
world’s first powered and controlled flying machine. Numerous flight designs and tests were done by scientists, engineers, and flight enthusiasts...conceptual flight and preliminary designs before they could control the craft with three-axis control and the correct airfoil design. These pioneers...analysis support. Although wind tunnel testing can provide data to predict and develop control surface designs, few SUAV operators opt to utilize wind
[Quality control in herbal supplements].
Oelker, Luisa
2005-01-01
Quality and safety of food and herbal supplements are the result of a number of different elements, such as good manufacturing practice and process control. Process control must be active and able to identify and correct all possible hazards. The main and most utilized instrument is the hazard analysis and critical control point (HACCP) system, the correct application of which can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.
Dimensional Precision Research of Wax Molding Rapid Prototyping based on Droplet Injection
NASA Astrophysics Data System (ADS)
Mingji, Huang; Geng, Wu; yan, Shan
2017-11-01
The traditional casting process is complex, and the mold is essential to the product; mold quality directly affects product quality. With the rapid prototyping method of 3D printing, a mold prototype can be produced. The wax model has the advantages of high speed, low cost and the ability to form complex structures. Using orthogonal experiments as the main method, each factor affecting dimensional precision is analyzed. The purpose is to obtain the optimal process parameters and to improve the dimensional accuracy of production based on droplet injection molding.
NASA Technical Reports Server (NTRS)
Stumpf, Richard P.; Arnone, Robert A.; Gould, Richard W., Jr.; Ransibrahmanakul, Varis; Tester, Patricia A.
2003-01-01
SeaWiFS has the ability to enhance our understanding of many oceanographic processes. However, its utility in the coastal zone has been limited by the need for valid bio-optical algorithms and for the determination of accurate water reflectances, particularly in the blue bands (412-490 nm), which have a significant impact on the effectiveness of all bio-optical algorithms. We have made advances in three areas: algorithm development (Table 16.1), field data collection, and data applications.
Scrutinizing UML Activity Diagrams
NASA Astrophysics Data System (ADS)
Al-Fedaghi, Sabah
Building an information system involves two processes: conceptual modeling of the “real world domain” and designing the software system. Object-oriented methods and languages (e.g., UML) are typically used for describing the software system. For the system analysis process that produces the conceptual description, object-oriented techniques or semantics extensions are utilized. Specifically, UML activity diagrams are the “flow charts” of object-oriented conceptualization tools. This chapter proposes an alternative to UML activity diagrams through the development of a conceptual modeling methodology based on the notion of flow.
A holistic framework for design of cost-effective minimum water utilization network.
Wan Alwi, S R; Manan, Z A; Samingin, M H; Misran, N
2008-07-01
Water pinch analysis (WPA) is a well-established tool for the design of a maximum water recovery (MWR) network. MWR, which is primarily concerned with water recovery and regeneration, only partly addresses water minimization problem. Strictly speaking, WPA can only lead to maximum water recovery targets as opposed to the minimum water targets as widely claimed by researchers over the years. The minimum water targets can be achieved when all water minimization options including elimination, reduction, reuse/recycling, outsourcing and regeneration have been holistically applied. Even though WPA has been well established for synthesis of MWR network, research towards holistic water minimization has lagged behind. This paper describes a new holistic framework for designing a cost-effective minimum water network (CEMWN) for industry and urban systems. The framework consists of five key steps, i.e. (1) Specify the limiting water data, (2) Determine MWR targets, (3) Screen process changes using water management hierarchy (WMH), (4) Apply Systematic Hierarchical Approach for Resilient Process Screening (SHARPS) strategy, and (5) Design water network. Three key contributions have emerged from this work. First is a hierarchical approach for systematic screening of process changes guided by the WMH. Second is a set of four new heuristics for implementing process changes that considers the interactions among process changes options as well as among equipment and the implications of applying each process change on utility targets. Third is the SHARPS cost-screening technique to customize process changes and ultimately generate a minimum water utilization network that is cost-effective and affordable. The CEMWN holistic framework has been successfully implemented on semiconductor and mosque case studies and yielded results within the designer payback period criterion.
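The SHARPS cost-screening idea in step (4) amounts to keeping only process-change options whose payback (net capital cost divided by annual saving) meets the designer's criterion. A hypothetical sketch; the options, costs, and 3-year criterion below are invented, not from the semiconductor or mosque case studies:

```python
# Illustrative SHARPS-style screening: retain process-change options whose
# simple payback period is within the designer's criterion. All figures
# are hypothetical.

def screen_by_payback(options, max_payback_yr):
    """Keep (name, payback) for options with payback <= criterion."""
    kept = []
    for name, capital_cost, annual_saving in options:
        payback = capital_cost / annual_saving
        if payback <= max_payback_yr:
            kept.append((name, round(payback, 2)))
    return kept

options = [
    ("reuse rinse water",  40_000, 25_000),   # 1.6 yr payback
    ("regeneration unit", 300_000, 60_000),   # 5.0 yr payback
    ("low-flow fixtures",  10_000,  8_000),   # 1.25 yr payback
]
kept = screen_by_payback(options, max_payback_yr=3.0)
print(kept)
```

The actual SHARPS technique also accounts for interactions among options (heuristics in step 3), so sequential screening like this is only a first cut.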
Patient complaints in healthcare services in Vietnam’s health system
Thi Thu Ha, Bui; Mirzoev, Tolib; Morgan, Rosemary
2015-01-01
Background: There is growing recognition of patient rights in health sectors around the world. Patients’ right to complain in hospitals, often visible in legislative and regulatory protocols, can be an important information source for service quality improvement and achievement of better health outcomes. However, empirical evidence on complaint processes is scarce, particularly in developing countries. To contribute to addressing this gap, we investigated patients’ complaint handling processes and the main influences on their implementation in public hospitals in Vietnam. Methods: The study was conducted in two provinces of Vietnam. We focused specifically on the implementation of the Law on Complaints and Denunciations and the Ministry of Health regulation on resolving complaints in the health sector. The data were collected using document review and in-depth interviews with key respondents. A framework approach was used for data analysis, guided by a conceptual framework and aided by qualitative data analysis software. Results: Five steps of complaint handling were implemented, which varied in practice between the provinces. Four groups of factors influenced the procedures: (1) insufficient investment in complaint handling procedures; (2) limited monitoring of complaint processes; (3) patients’ low awareness of, and perceived lack of power to change, complaint procedures; and (4) autonomization pressures on local health facilities. While the existence of complaint handling processes is evident in the health system in Vietnam, their utilization was often limited. The factors constraining the implementation and use of complaint regulations included health system–related issues as well as social and cultural influences. Conclusion: The study aimed to contribute to improved understanding of complaint handling processes and the key factors influencing these processes in public hospitals in Vietnam.
Specific policy implications for improving these processes were proposed, including improving the accountability of service providers and better utilization of information on complaints. PMID:26770804
Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.
2013-01-01
The use of acoustic Doppler current profilers (ADCPs) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years, driven primarily by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments; this paper builds on that progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging, and smoothing), visualization (planform and cross-section vector plotting and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
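The vector-rotation step that VMT performs on each transect can be illustrated in a few lines. VMT itself is MATLAB; the sketch below is Python/NumPy, and the angle convention (degrees counterclockwise from east for the transect-normal direction) is an assumption for illustration, not VMT's actual convention.

```python
import numpy as np

def rotate_to_xsec(ve, vn, theta_deg):
    """Rotate east/north velocity components into cross-section
    coordinates: vs is the component normal to the transect
    (streamwise) and vt is the component along the transect.

    theta_deg: angle of the transect-normal direction, degrees
    counterclockwise from east (illustrative convention only).
    """
    th = np.radians(theta_deg)
    vs = ve * np.cos(th) + vn * np.sin(th)    # project onto transect normal
    vt = -ve * np.sin(th) + vn * np.cos(th)   # project onto transect axis
    return vs, vt

# A 1 m/s purely eastward flow, with the transect normal pointing east,
# is entirely streamwise:
vs, vt = rotate_to_xsec(np.array([1.0]), np.array([0.0]), 0.0)
```

The same rotation, applied bin-by-bin to an ensemble of ADCP velocity profiles, is what allows planform and cross-section views of secondary circulation.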
Fungal fermentation on anaerobic digestate for lipid-based biofuel production.
Zhong, Yuan; Liu, Zhiguo; Isaguirre, Christine; Liu, Yan; Liao, Wei
2016-01-01
Anaerobic digestate is the effluent from anaerobic digestion of organic wastes. It contains a significant amount of nutrients and lignocellulosic material, even though anaerobic digestion consumes a large portion of the organic matter in the wastes. Utilizing the nutrients and lignocellulosic material in the digestate is critical to significantly improving the efficiency of anaerobic digestion technology and generating value-added chemical and fuel products from the organic wastes. This study therefore focused on developing an integrated process that uses biogas energy to power fungal fermentation and converts the remaining carbon sources, nutrients, and water in the digestate into a biofuel precursor: lipid. The process comprises two unit operations, anaerobic digestion and digestate utilization. Digestate utilization includes alkali treatment of the mixed feed of solid and liquid digestates, enzymatic hydrolysis for mono-sugar release, overliming detoxification, and fungal fermentation for lipid accumulation. The experimental results indicate that 5 h and 30 °C were the preferred conditions for overliming detoxification with respect to lipid accumulation in the subsequent fungal cultivation. Repeated-batch fungal fermentation enhanced lipid accumulation, leading to a final lipid concentration of 3.16 g/L on digestate with 10% dry matter. The mass and energy balance analysis further indicates that the digestate had enough water for the process and that the biogas energy was able to meet the needs of the individual unit operations. A fresh-water-free and energy-positive process of lipid production from anaerobic digestate was thus achieved by integrating anaerobic digestion and fungal fermentation. The integration addresses issues that both the biofuel industry and waste management encounter: the high water and energy demand of biofuel precursor production, and the few digestate utilization approaches available for organic waste treatment.
Real-time fMRI processing with physiological noise correction - Comparison with off-line analysis.
Misaki, Masaya; Barzigar, Nafise; Zotev, Vadim; Phillips, Raquel; Cheng, Samuel; Bodurka, Jerzy
2015-12-30
While applications of real-time functional magnetic resonance imaging (rtfMRI) are growing rapidly, there are still limitations in real-time data processing compared to off-line analysis. We developed a proof-of-concept real-time fMRI processing (rtfMRIp) system utilizing a personal computer (PC) with a dedicated graphic processing unit (GPU) to demonstrate that it is now possible to perform intensive whole-brain fMRI data processing in real-time. The rtfMRIp performs slice-timing correction, motion correction, spatial smoothing, signal scaling, and general linear model (GLM) analysis with multiple noise regressors including physiological noise modeled with cardiac (RETROICOR) and respiration volume per time (RVT). The whole-brain data analysis with more than 100,000 voxels and more than 250 volumes is completed in less than 300 ms, much faster than the time required to acquire the fMRI volume. Real-time processing implementation cannot be identical to off-line analysis when time-course information is used, such as in slice-timing correction, signal scaling, and GLM. We verified that reduced slice-timing correction for real-time analysis produced output comparable to off-line analysis. The real-time GLM analysis, however, showed over-fitting when the number of sampled volumes was small. Our system implemented real-time RETROICOR and RVT physiological noise corrections for the first time and it is capable of processing these steps on all available data at a given time, without need for recursive algorithms. Comprehensive data processing in rtfMRI is possible with a PC, while the number of samples should be considered in real-time GLM. Copyright © 2015 Elsevier B.V. All rights reserved.
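The mass-univariate GLM step at the core of such pipelines reduces to one least-squares solve across all voxels. A minimal NumPy sketch on synthetic data (the design and the two "physiological" regressors are invented stand-ins, not an actual RETROICOR/RVT implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n_vols, n_vox = 250, 1000          # time points (volumes) and voxels

# Design matrix: task regressor + constant + two illustrative nuisance
# regressors standing in for cardiac- and respiration-related signals.
t = np.arange(n_vols)
task = (t // 10) % 2               # simple block design
X = np.column_stack([task,
                     np.ones(n_vols),
                     np.sin(2 * np.pi * t / 12.0),    # "cardiac-like"
                     np.cos(2 * np.pi * t / 30.0)])   # "respiratory-like"

# Synthetic data: every voxel responds to the task with beta = 2.
true_beta = np.array([2.0, 10.0, 0.5, 0.3])
Y = X @ true_beta[:, None] + rng.normal(0, 1, (n_vols, n_vox))

# One lstsq call fits all voxels simultaneously (mass-univariate GLM).
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
task_beta = beta[0]                # estimated task effect per voxel
```

The over-fitting caveat in the abstract corresponds to `n_vols` being small relative to the number of regressors: with few volumes, the nuisance columns can absorb task variance.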
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaygusuz, K.
Exergy analysis is a general method for efficiency analysis of systems and processes. The use of the exergy concept and the analysis of the ultimate efficiencies of processes is still largely limited to the academic world, for several reasons. To overcome some of the difficulties in industrial applications of energy analysis, exergy analysis has been employed here. The chemical exergy of a substance is the maximum work that can be obtained from it by bringing it to chemical equilibrium with the reference environment at constant temperature and pressure. First law analysis gives only the quantity of energy, while the second law also characterizes its quality. The projected increase in coal utilization in power plants makes it desirable to evaluate the energy content of coal both quantitatively and qualitatively. In the present study, the chemical exergies of some good-quality Turkish coals were calculated with a BASIC program using second law analysis, and the results are tabulated.
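Chemical exergy of solid fuels is commonly estimated from the lower heating value via a correlation of the Szargut-Styrylska type, in which the exergy-to-LHV ratio is a function of the fuel's elemental mass ratios. The sketch below uses coefficients of that published form, but treat the exact values and validity range as assumptions to be checked against the original source; the coal composition is invented for illustration.

```python
def chemical_exergy_solid_fuel(lhv_mj_per_kg, c, h, o, n):
    """Approximate specific chemical exergy (MJ/kg) of a solid fuel.

    phi (exergy/LHV ratio) is estimated from dry-basis mass fractions
    c, h, o, n using a Szargut/Styrylska-style correlation.  The
    coefficients and the o/c <= 0.5 validity limit are quoted from
    memory and should be verified against the original reference.
    """
    if o / c > 0.5:
        raise ValueError("correlation not valid for o/c > 0.5")
    phi = 1.0437 + 0.1882 * (h / c) + 0.0610 * (o / c) + 0.0404 * (n / c)
    return phi * lhv_mj_per_kg

# Illustrative good-quality bituminous coal (mass fractions, dry basis):
ex = chemical_exergy_solid_fuel(lhv_mj_per_kg=27.0, c=0.75, h=0.05, o=0.08, n=0.015)
```

For typical coals phi is only a few percent above unity, so the chemical exergy slightly exceeds the LHV, which is why first-law and second-law rankings of coals are similar but not identical.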
Sensitivity analysis of add-on price estimate for select silicon wafering technologies
NASA Technical Reports Server (NTRS)
Mokashi, A. R.
1982-01-01
The cost of producing wafers from silicon ingots is a major component of the add-on price of silicon sheet. Economic analyses of the add-on price estimates and their sensitivity for internal-diameter (ID) sawing, multiblade slurry (MBS) sawing, and the fixed-abrasive slicing technique (FAST) are presented. Interim price estimation guidelines (IPEG) are used for estimating a process add-on price. Sensitivity analysis of price is performed with respect to cost parameters such as equipment, space, direct labor, materials (blade life), and utilities, and production parameters such as slicing rate, slices per centimeter, and process yield, using a computer program specifically developed to perform sensitivity analysis with IPEG. The results aid in identifying the important cost parameters and assist in deciding the direction of technology development efforts.
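The one-at-a-time sensitivity analysis described above can be sketched in a few lines. The price function below mimics the IPEG structure (annualized equipment, space, labor, materials, and utilities over annual quantity), but the multipliers and cost figures are illustrative placeholders, not the published IPEG coefficients or the paper's data.

```python
def add_on_price(eqpt, sqft, dlab, mats, util, quantity):
    """IPEG-style add-on price estimate ($/unit).  Multipliers are
    illustrative stand-ins for the published IPEG coefficients."""
    annualized = 0.5 * eqpt + 100.0 * sqft + 2.0 * dlab + 1.2 * (mats + util)
    return annualized / quantity

# Invented base case for one slicing technology:
base = dict(eqpt=2.0e5, sqft=500.0, dlab=1.0e5, mats=5.0e4, util=2.0e4,
            quantity=1.0e6)

# One-at-a-time sensitivity: relative price change for a +10% change
# in each cost parameter, holding the others at base values.
p0 = add_on_price(**base)
sensitivity = {}
for k in ("eqpt", "sqft", "dlab", "mats", "util"):
    bumped = dict(base, **{k: 1.1 * base[k]})
    sensitivity[k] = (add_on_price(**bumped) - p0) / p0
```

Ranking the entries of `sensitivity` identifies which cost parameter most strongly drives the add-on price, which is exactly the kind of output used to direct technology development effort.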
Coupled Loads Analysis of the Modified NASA Barge Pegasus and Space Launch System Hardware
NASA Technical Reports Server (NTRS)
Knight, J. Brent
2015-01-01
A Coupled Loads Analysis (CLA) has been performed for barge transport of Space Launch System hardware on the recently modified NASA barge Pegasus. The barge re-design was supported by detailed finite element analyses performed by the Army Corps of Engineers Marine Design Center. The Finite Element Model (FEM) utilized in the design was also used in the subject CLA. The Pegasus FEM and CLA results are presented, along with a comparison of the analysis process to that for a payload transported to space via the Space Shuttle. The dynamic forcing functions are also discussed. The process of performing a dynamic CLA of NASA hardware during marine transport is believed to be a first and can likely help minimize undue conservatism.
Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping
2018-05-16
As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the highly nonlinear and inherently time-varying dynamics of batch processes. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means for dealing with batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on the modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), by closely integrating discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant features of a batch process as well as preserve global and local geometrical structure information of observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identify the fault variables after a fault is detected, which is derived from the idea of variable pseudo-sample trajectory projection in the DGKSFA nonlinear biplot. Simulation results conducted on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect faults and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
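The linear core of slow feature analysis, on which the kernel and global-preserving variants above build, is compact: whiten the data, then find the unit-variance projection whose time derivative has minimal variance. A minimal NumPy sketch on synthetic signals (linear SFA only; the kernel, discriminant, and structure-preserving extensions of GKSFA/DGKSFA are omitted):

```python
import numpy as np

# Two observed channels, each a mixture of a slow and a fast source.
t = np.linspace(0, 4 * np.pi, 500)
slow = np.sin(t)                       # slowly varying source
fast = np.sin(29 * t)                  # rapidly varying source
X = np.column_stack([slow + 0.1 * fast, fast + 0.1 * slow])
X = X - X.mean(0)

# Step 1: whiten X so all projections have unit variance.
cov = X.T @ X / len(X)
d, E = np.linalg.eigh(cov)
W = E / np.sqrt(d)                     # whitening matrix: W.T @ cov @ W = I
Z = X @ W

# Step 2: in whitened space, diagonalize the covariance of the
# time-derivative signal; the eigenvector with the smallest
# eigenvalue gives the slowest feature.
dZ = np.diff(Z, axis=0)
dd, V = np.linalg.eigh(dZ.T @ dZ / len(dZ))
slow_feature = Z @ V[:, 0]             # projection with slowest dynamics
```

Despite the mixing, `slow_feature` recovers the slow source (up to sign and scale), which is the property the monitoring statistics in such methods exploit.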
Prediction of Adequate Prenatal Care Utilization Based on the Extended Parallel Process Model
Hajian, Sepideh; Imani, Fatemeh; Riazi, Hedyeh; Salmani, Fatemeh
2017-01-01
ABSTRACT Background: Pregnancy complications are one of the major public health concerns. One of the main causes of preventable complications is the absence of or inadequate provision of prenatal care. The present study was conducted to investigate whether the Extended Parallel Process Model’s constructs can predict the utilization of prenatal care services. Methods: The present longitudinal prospective study was conducted on 192 pregnant women selected through multi-stage sampling of health facilities in Qeshm, Hormozgan province, from April to June 2015. Participants were followed up from the first half of pregnancy until childbirth to assess adequate or inadequate/non-utilization of prenatal care services. Data were collected using the structured Risk Behavior Diagnosis Scale. The analysis of the data was carried out in SPSS-22 using one-way ANOVA, linear regression, and logistic regression analysis. The level of significance was set at 0.05. Results: In total, 178 pregnant women with a mean age of 25.31±5.42 years completed the study. Perceived self-efficacy (OR=25.23; P<0.001) and perceived susceptibility (OR=0.048; P<0.001) were two predictors of the intention to utilize prenatal care. Husband’s occupation in the labor market (OR=0.43; P=0.02), unwanted pregnancy (OR=0.352; P<0.001), and the need to care for minors or the elderly at home (OR=0.35; P=0.045) were associated with lower odds of receiving prenatal care. Conclusion: The model showed that when the perceived efficacy of prenatal care services overcame the perceived threat, the likelihood of prenatal care usage increased. This study identified some modifiable factors associated with prenatal care usage by women, providing key targets for appropriate clinical interventions. PMID:29043280
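The odds ratios above come from binary logistic regression, which can be sketched without any statistics package. The example below fits a one-predictor model by gradient ascent on synthetic data; the predictor name and effect size are invented for illustration and are not the study's data.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Plain gradient-ascent logistic regression with an intercept;
    a minimal stand-in for the SPSS logistic analysis."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))     # predicted probabilities
        w += lr * Xb.T @ (y - p) / len(y)     # log-likelihood gradient
    return w

rng = np.random.default_rng(1)
# Synthetic standardized "perceived self-efficacy" score; higher values
# raise the odds of adequate care utilization (illustrative data only).
x = rng.normal(0, 1, 400)
prob = 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * x)))
y = (rng.uniform(size=400) < prob).astype(float)

w = fit_logistic(x[:, None], y)
odds_ratio = np.exp(w[1])   # OR per 1-SD increase in the score
```

Exponentiating a fitted coefficient gives the odds ratio reported in such tables; an OR above 1 means the predictor raises the odds of utilization.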
Kittelmann, Jörg; Ottens, Marcel; Hubbuch, Jürgen
2015-04-15
High-throughput batch screening technologies have become an important tool in downstream process development. Although continued miniaturization saves time and sample consumption, no screening process has yet been described in the 384-well microplate format. Several processes are established in the 96-well format to investigate protein-adsorbent interactions, utilizing between 6.8 and 50 μL of resin per well. However, as sample consumption scales with resin volume and throughput scales with experiments per microplate, these processes are limited in cost and time savings. In this work, a new method for in-well resin quantification by optical means, applicable in the 384-well format with resin volumes as small as 0.1 μL, is introduced. An HTS batch isotherm process is described that utilizes this new method in combination with optical sample volume quantification for screening of isotherm parameters in 384-well microplates. Results are qualified by confidence bounds determined by bootstrap analysis and a comprehensive Monte Carlo study of error propagation. This new approach opens the door to a variety of screening processes in the 384-well format on HTS stations, higher quality screening data, and an increase in throughput. Copyright © 2015 Elsevier B.V. All rights reserved.
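The "isotherm parameters with bootstrap confidence bounds" workflow can be sketched generically. The example below assumes a Langmuir isotherm (a common choice; the abstract does not specify the model) and fits it by a dependency-free grid search on synthetic batch data; all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def langmuir(c, qmax, kd):
    """Langmuir isotherm: bound protein q vs. liquid-phase conc. c."""
    return qmax * c / (kd + c)

# Synthetic batch-isotherm data (not from the paper):
c = np.linspace(0.1, 5.0, 12)
q_obs = langmuir(c, qmax=50.0, kd=1.0) + rng.normal(0, 1.0, c.size)

def fit(c, q):
    """Least-squares fit by coarse grid search; a real pipeline would
    use a proper nonlinear optimizer."""
    qmaxs = np.linspace(20, 80, 121)
    kds = np.linspace(0.2, 3.0, 141)
    Q, K = np.meshgrid(qmaxs, kds, indexing="ij")
    resid = ((Q[..., None] * c / (K[..., None] + c) - q) ** 2).sum(-1)
    i, j = np.unravel_index(resid.argmin(), resid.shape)
    return qmaxs[i], kds[j]

qmax_hat, kd_hat = fit(c, q_obs)

# Bootstrap confidence bounds on qmax, as in the screening workflow:
boot = []
for _ in range(200):
    idx = rng.integers(0, c.size, c.size)   # resample (c, q) pairs
    boot.append(fit(c[idx], q_obs[idx])[0])
lo, hi = np.percentile(boot, [2.5, 97.5])   # 95% interval for qmax
```

Resampling the (concentration, binding) pairs and refitting yields an empirical distribution of each parameter, from which percentile confidence bounds qualify the screening result.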
Smartphone Analytics: Mobilizing the Lab into the Cloud for Omic-Scale Analyses.
Montenegro-Burke, J Rafael; Phommavongsay, Thiery; Aisporna, Aries E; Huan, Tao; Rinehart, Duane; Forsberg, Erica; Poole, Farris L; Thorgersen, Michael P; Adams, Michael W W; Krantz, Gregory; Fields, Matthew W; Northen, Trent R; Robbins, Paul D; Niedernhofer, Laura J; Lairson, Luke; Benton, H Paul; Siuzdak, Gary
2016-10-04
Active data screening is an integral part of many scientific activities, and mobile technologies have greatly facilitated this process by minimizing the reliance on large hardware instrumentation. To keep pace with the rapidly growing field of metabolomics and the heavy workload of data processing, we designed the first remote metabolomic data screening platform for mobile devices. Two mobile applications (apps), XCMS Mobile and METLIN Mobile, facilitate access to XCMS and METLIN, which are the most important components in the computer-based XCMS Online platforms. These mobile apps allow for the visualization and analysis of metabolic data throughout the entire analytical process. Specifically, XCMS Mobile and METLIN Mobile provide the capabilities for remote monitoring of data processing, real time notifications for the data processing, visualization and interactive analysis of processed data (e.g., cloud plots, principal component analysis, box-plots, extracted ion chromatograms, and hierarchical cluster analysis), and database searching for metabolite identification. These apps, available on Apple iOS and Google Android operating systems, allow for the migration of metabolomic research onto mobile devices for better accessibility beyond direct instrument operation. The utility of XCMS Mobile and METLIN Mobile is demonstrated here through the metabolomic LC-MS analyses of stem cells, colon cancer, aging, and bacterial metabolism.
A synoptic description of coal basins via image processing
NASA Technical Reports Server (NTRS)
Farrell, K. W., Jr.; Wherry, D. B.
1978-01-01
An existing image processing system is adapted to describe the geologic attributes of a regional coal basin. This scheme handles a map as if it were a matrix, in contrast to more conventional approaches which represent map information in terms of linked polygons. The utility of the image processing approach is demonstrated by a multiattribute analysis of the Herrin No. 6 coal seam in Illinois. Findings include the location of a resource and estimation of tonnage corresponding to constraints on seam thickness, overburden, and Btu value, which are illustrative of the need for new mining technology.
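The map-as-matrix approach described above reduces a multiattribute resource query to array operations. A minimal Python/NumPy sketch with synthetic grids (the layer values, cell size, density, and thresholds are invented for illustration, not Herrin No. 6 data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Raster "map layers" on a common grid, one value per cell: the matrix
# view of a map, in contrast to linked-polygon representations.
cell_km2 = 1.0
thickness_m = rng.uniform(0.0, 3.0, (200, 200))      # seam thickness
overburden_m = rng.uniform(10.0, 300.0, (200, 200))  # depth of cover
btu_per_lb = rng.uniform(9000, 13000, (200, 200))    # heating value

# Multiattribute query: cells satisfying joint constraints on
# thickness, overburden, and Btu value.
mask = (thickness_m >= 1.0) & (overburden_m <= 150.0) & (btu_per_lb >= 11000)

# Tonnage of the qualifying resource (illustrative in-place density).
density_t_per_m3 = 1.35
tonnage = (thickness_m[mask] * cell_km2 * 1e6 * density_t_per_m3).sum()
```

Because every layer shares the same grid, intersecting constraints is a single elementwise boolean expression, which is the practical advantage of the matrix representation over polygon overlay.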
Classification of cognitive systems dedicated to data sharing
NASA Astrophysics Data System (ADS)
Ogiela, Lidia; Ogiela, Marek R.
2017-08-01
This paper presents a classification of new cognitive information systems dedicated to cryptographic data splitting and sharing processes. Cognitive processes of semantic data analysis and interpretation are used to describe new classes of intelligent information and vision systems. In addition, cryptographic data splitting algorithms and cryptographic threshold schemes are used to improve secure and efficient information management with application of such cognitive systems. The utility of the proposed cognitive sharing procedures and distributed data sharing algorithms is also presented, and a few possible applications of cognitive approaches to visual information management and encryption are described.
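The cryptographic threshold schemes mentioned above are typically of the Shamir type: a secret is encoded as the constant term of a random polynomial over a prime field, and any k of n evaluation points reconstruct it. A generic textbook sketch follows (this is standard Shamir sharing, not the cognitive-systems variant the paper proposes):

```python
import random

PRIME = 2_147_483_647  # prime modulus for the field arithmetic

def make_shares(secret, k, n):
    """Shamir (k, n) threshold scheme: any k of the n shares
    reconstruct the secret; fewer reveal nothing about it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the polynomial at x = 0 over the field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(123456789, k=3, n=5)
recovered = reconstruct(shares[:3])   # any 3 of the 5 shares suffice
```

In a data-sharing system, each share would be distributed to a different party or location, so that secure reconstruction requires a quorum rather than trust in any single holder.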
A Theoretical and Experimental Analysis of the Outside World Perception Process
NASA Technical Reports Server (NTRS)
Wewerinke, P. H.
1978-01-01
The outside scene is often an important source of information for manual control tasks; important examples are car driving and aircraft control. This paper deals with modelling this visual scene perception process on the basis of linear perspective geometry and relative motion cues. Model predictions, utilizing psychophysical threshold data from baseline experiments and the literature, are compared with experimental data for a variety of visual approach tasks. Both the performance and workload results illustrate that the model provides a meaningful description of the outside world perception process, with a useful predictive capability.
Advances in interpretation of subsurface processes with time-lapse electrical imaging
Singha, Kamini; Day-Lewis, Frederick D.; Johnson, Tim B.; Slater, Lee D.
2015-01-01
Electrical geophysical methods, including electrical resistivity, time-domain induced polarization, and complex resistivity, have become commonly used to image the near subsurface. Here, we outline their utility for time-lapse imaging of hydrological, geochemical, and biogeochemical processes, focusing on new instrumentation, processing, and analysis techniques specific to monitoring. We review data collection procedures, parameters measured, and petrophysical relationships and then outline the state of the science with respect to inversion methodologies, including coupled inversion. We conclude by highlighting recent research focused on innovative applications of time-lapse imaging in hydrology, biology, ecology, and geochemistry, among other areas of interest.
Extraterrestrial consumables production and utilization
NASA Technical Reports Server (NTRS)
Sanders, A. P.
1972-01-01
Potential oxygen requirements for lunar-surface, lunar-orbit, and planetary missions are presented with emphasis on: (1) emergency survival of the crew, (2) provision of energy consumables for vehicles, and (3) nondependency on an earth supply of oxygen. Although many extraterrestrial resource processes are analytically feasible, this study has considered hydrogen and fluorine processing concepts to obtain oxygen or water (or both). The results are quite encouraging and are extrapolatable to other processes. Preliminary mission planning and sequencing analysis has enabled the programmatic evaluation of using lunar-derived oxygen relative to transportation cost as a function of vehicle delivery and operational capability.
Bayesian approaches for Integrated Water Resources Management. A Mediterranean case study.
NASA Astrophysics Data System (ADS)
Gulliver, Zacarías; Herrero, Javier; José Polo, María
2013-04-01
This study presents the first steps of a short-term/mid-term analysis of the water resources in the Guadalfeo Basin, Spain. Within the basin, the recent construction of the Rules dam has required the development of specific management tools and structures for this water system. Climate variability and the high water demand for agricultural irrigation and tourism in this region may cause controversies in the water management planning process. During the first stages of the study, a rigorous analysis of the Water Framework Directive results was carried out in order to implement the legal requirements and address the gaps identified by the water authorities. In addition, stakeholders and water experts identified the variables and geophysical processes specific to this water system; these particularities need to be taken into account and reflected in the final computational tool. For mid-term decision making, a Bayesian network has been used to quantify uncertainty, which also provides a structured representation of probabilities, actions/decisions, and utilities. On the one hand, these techniques allow the inclusion of decision rules, generating influence diagrams that provide clear and coherent semantics for the value of making an observation. On the other hand, the utility nodes encode the stakeholders' preferences, measured on a numerical scale, and the action that maximizes the expected utility (MEU) is chosen. This graphical model also allows us to identify gaps and plan corrective measures, for example by formulating scenarios associated with different event hypotheses. In this sense, conditional probability distributions of seasonal water demand and waste water have been obtained over the established intervals. This will give regional water managers useful information for future decision making.
The final display is highly visual and allows the user to quickly understand the model and the causal relationships between the nodes and variables. The input data were collected from local monitoring networks, and the unmonitored data were generated with WiMMed, a physically based, spatially distributed hydrological model that has been calibrated and validated. For short-term purposes, pattern analysis has been applied to the management of extreme-event scenarios, with techniques such as Bayesian Neural Networks (BNN) and Gaussian Processes (GP) providing accurate predictions.
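The maximum-expected-utility rule at the heart of such influence diagrams is simple to state in code. A toy sketch follows; the states, probabilities, actions, and utilities are invented for illustration and do not come from the Guadalfeo study.

```python
# MEU choice for a toy water-allocation decision: pick the release
# policy maximizing expected utility over seasonal inflow scenarios.
p_inflow = {"wet": 0.3, "normal": 0.5, "dry": 0.2}

# utility[action][state]: payoff of each policy under each scenario
# (invented numbers; a utility node in an influence diagram).
utility = {
    "release_high": {"wet": 10, "normal": 6, "dry": -5},
    "release_low":  {"wet": 4,  "normal": 5, "dry": 3},
}

def expected_utility(action):
    return sum(p * utility[action][s] for s, p in p_inflow.items())

best = max(utility, key=expected_utility)
```

Conditioning `p_inflow` on an observation (e.g., a seasonal forecast) before taking the maximum is what gives the "value of making an observation" its coherent semantics: it is the increase in the maximum expected utility.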
Functional Analysis of the Aspergillus nidulans Kinome
De Souza, Colin P.; Hashmi, Shahr B.; Osmani, Aysha H.; Andrews, Peter; Ringelberg, Carol S.; Dunlap, Jay C.; Osmani, Stephen A.
2013-01-01
The filamentous fungi are an ecologically important group of organisms which also have important industrial applications but devastating effects as pathogens and agents of food spoilage. Protein kinases have been implicated in the regulation of virtually all biological processes, but how they regulate filamentous fungal specific processes is not understood. The filamentous fungus Aspergillus nidulans has long been utilized as a powerful molecular genetic system, and recent technical advances have made systematic approaches to study large gene sets possible. To enhance A. nidulans functional genomics we have created gene deletion constructs for 9851 genes representing 93.3% of the encoding genome. To illustrate the utility of these constructs, and advance the understanding of fungal kinases, we have systematically generated deletion strains for 128 A. nidulans kinases including expanded groups of 15 histidine kinases, 7 SRPK (serine-arginine protein kinases) kinases and an interesting group of 11 filamentous fungal specific kinases. We defined the terminal phenotype of 23 of the 25 essential kinases by heterokaryon rescue and identified phenotypes for 43 of the 103 non-essential kinases. Uncovered phenotypes ranged from almost no growth for a small number of essential kinases implicated in processes such as ribosomal biosynthesis, to conditional defects in response to cellular stresses. The data provide experimental evidence that previously uncharacterized kinases function in the septation initiation network, the cell wall integrity and the morphogenesis Orb6 kinase signaling pathways, as well as in pathways regulating vesicular trafficking, sexual development and secondary metabolism. Finally, we identify ChkC as a third effector kinase functioning in the cellular response to genotoxic stress. The identification of many previously unknown functions for kinases through the functional analysis of the A. nidulans kinome illustrates the utility of the A. nidulans gene deletion constructs. PMID:23505451
Medical Management: Process Analysis Study Report
2011-10-28
in Medical Management (care coordinator, case manager, PCM, clinic nurses, referral management shop, utilization management?, etc). The goal is to...Enterprise Nursing Procedure Manual, revealed that fact from the Navy’s perspective. An OASD(HA) TRICARE Management Activity (TMA) Senior...Requirements Analyst, Clinical Information Management (IM) and retired Army Colonel Nurse, Patricia Kinder, essentially told us no single application suite
Working group organizational meeting
NASA Technical Reports Server (NTRS)
1982-01-01
Scene radiation and atmospheric effects, mathematical pattern recognition and image analysis, information evaluation and utilization, and electromagnetic measurements and signal handling are considered. Research issues in sensors and signals, including radar (SAR) reflectometry, SAR processing speed, registration, including overlay of SAR and optical imagery, entire system radiance calibration, and lack of requirements for both sensors and systems, etc. were discussed.
Sari C. Saunders; Jiquan Chen; Thomas D. Drummer; Eric J. Gustafson; Kimberley D. Brosofske
2005-01-01
Identifying scales of pattern in ecological systems and coupling patterns to processes that create them are ongoing challenges. We examined the utility of three techniques (lacunarity, spectral, and wavelet analysis) for detecting scales of pattern of ecological data. We compared the information obtained using these methods for four datasets, including: surface...
Analysis of the Integration of Skill Standards into Community College Curriculum
ERIC Educational Resources Information Center
Aragon, Steven R.; Woo, Hui-Jeong; Marvel, Matthew R.
2004-01-01
The utilization of skill standards in the curriculum development process has become an increasingly prominent aspect of the reform movement in career and technical education over the past 10 years. Standards are seen as a way to achieve better accountability within Career and Technical Education (CTE) systems, and improve their quality as well as…
Analysis of the Integration of Skill Standards into Community College Curriculum
ERIC Educational Resources Information Center
Aragon, Steven R.; Woo, Hui-Jeong; Marvel, Matthew R.
2005-01-01
The utilization of skill standards in the curriculum development process has become an increasingly prominent aspect of the reform movement in career and technical education (CTE) over the past 10 years. Data were collected across 10 CTE program areas from a nationally representative sample of community colleges. The authors discuss the extent to…
USDA-ARS?s Scientific Manuscript database
In this study, a process model of a 2000 metric ton per day (MTPD) eucalyptus Tail Gas Reactive Pyrolysis (TGRP) and electricity generation plant was developed and simulated in SimSci Pro/II software for the purpose of evaluating its techno-economic viability in Brazil. Two scenarios were compared b...
Spacecraft detumbling through energy dissipation
NASA Technical Reports Server (NTRS)
Fitz-Coy, Norman; Chatterjee, Anindya
1993-01-01
The attitude motion of a tumbling, rigid, axisymmetric spacecraft is considered. A methodology for detumbling the spacecraft through energy dissipation is presented. The differential equations governing this motion are stiff, and therefore an approximate solution, based on the variation of constants method, is developed and utilized in the analysis of the detumbling strategy. Stability of the detumbling process is also addressed.
ERIC Educational Resources Information Center
Toutkoushian, Robert K.
This paper proposes a five-step process by which to analyze whether the salary ratio between junior and senior college faculty exhibits salary compression, a term used to describe an unusually small differential between faculty with different levels of experience. The procedure utilizes commonly used statistical techniques (multiple regression…
The Utility of the Child and Adolescent Psychopathy Construct in Hong Kong, China
ERIC Educational Resources Information Center
Fung, Annis Lai-Chu; Gao, Yu; Raine, Adrian
2010-01-01
This cross-sectional study examined the nature of child and adolescent psychopathy using the Antisocial Process Screening Device (APSD) in 3,675 schoolchildren (ages 11-16) in Hong Kong, China. A confirmatory factor analysis observed a good fit for the three-factor model (callous-unemotional, impulsivity, narcissism) of APSD, with boys scoring…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Eric C.D.
This paper presents a comparative techno-economic analysis of four emerging conversion pathways from biomass to gasoline-, jet-, and diesel-range hydrocarbons via indirect liquefaction with specific focus on pathways utilizing oxygenated intermediates. The processing steps include: biomass-to-syngas via indirect gasification, gas cleanup, conversion of syngas to alcohols/oxygenates followed by conversion of alcohols/oxygenates to hydrocarbon blendstocks via dehydration, oligomerization, and hydrogenation.
ERIC Educational Resources Information Center
Newton, Xiaoxia A.
2010-01-01
This paper reported results from a generalizability study that examined the process of developing classroom practice indicators used to evaluate the impact of a school district's mathematics reform initiative. The study utilized classroom observational data from 32 second, fourth, eighth, and tenth grade teachers. The study addresses important…
An Exploratory Analysis of the U.S. System of Major Defense Acquisition Utilizing the CLIOS Process
2009-09-01
SPENDING COUNTRIES ... 1. The United States’ Defense Acquisition System ... 2. ...Improvement Initiatives ... Table 4. The Top 15 Military Spender Countries in 2008...other top military spending countries. It will end with a review of the major defense acquisition literature. This literature review will focus on
Methodological challenges collecting parent phone-call healthcare utilization data.
Moreau, Paula; Crawford, Sybil; Sullivan-Bolyai, Susan
2016-02-01
Recommendations by the National Institute of Nursing Research and other groups have strongly encouraged nurses to pay greater attention to cost-effectiveness analysis when conducting research. Given the increasing prominence of translational science and comparative effectiveness research, cost-effectiveness analysis has become a basic tool in determining intervention value in research. Tracking phone-call communication (number of calls and context) with cross-checks between parents and healthcare providers is an example of this type of healthcare utilization data collection. This article identifies some methodological challenges that have emerged in the process of collecting this type of data in a randomized controlled trial: Parent Education Through Simulation-Diabetes (PETS-D). We also describe ways in which those challenges have been addressed with comparison data results, and make recommendations for future research. Copyright © 2015 Elsevier Inc. All rights reserved.
The lost steps of infancy: symbolization, analytic process and the growth of the self.
Feldman, Brian
2002-07-01
In 'The Lost Steps' the Latin American novelist Alejo Carpentier describes the search by the protagonist for the origins of music among native peoples in the Amazon jungle. This metaphor can be utilized as a way of understanding the search for the pre-verbal origins of the self in analysis. The infant's experience of the tempo and rhythmicity of the mother/infant interaction and the bathing in words and sounds of the infant by the mother are at the core of the infant's development of the self. The infant observation method (Tavistock model) will be looked at as a way of developing empathy in the analyst to better understand infantile, pre-verbal states of mind. A case vignette from an adult analysis will be utilized to illustrate the theoretical concepts.
Automated Meteor Detection by All-Sky Digital Camera Systems
NASA Astrophysics Data System (ADS)
Suk, Tomáš; Šimberová, Stanislava
2017-12-01
We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.
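A common first step in automated change detection of the kind this pipeline performs is simple frame differencing with a brightness threshold. The sketch below is hypothetical and greatly simplified (the actual system uses more elaborate methods for fish-eye imagery); the function name, data, and threshold are invented for illustration:

```python
# Hypothetical sketch: flag pixels that brightened sharply between two
# co-registered frames, a crude stand-in for meteor-trail screening.
def changed_pixels(prev, curr, threshold):
    hits = []
    for r, (row_p, row_c) in enumerate(zip(prev, curr)):
        for c, (p, v) in enumerate(zip(row_p, row_c)):
            if v - p > threshold:  # pixel got much brighter
                hits.append((r, c))
    return hits

prev = [[10, 10, 10], [10, 10, 10]]
curr = [[10, 80, 90], [10, 10, 10]]
print(changed_pixels(prev, curr, 50))  # -> [(0, 1), (0, 2)]
```

A real detector would additionally test whether the flagged pixels form a linear streak consistent with a meteor trail.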
NASA Astrophysics Data System (ADS)
Mamat, Siti Salwana; Ahmad, Tahir; Awang, Siti Rahmah
2017-08-01
Analytic Hierarchy Process (AHP) is a method used in structuring, measuring and synthesizing criteria, in particular ranking of multiple criteria in decision making problems. The Potential Method, on the other hand, is a ranking procedure which utilizes a preference graph ς(V, A). Two nodes are adjacent if they are compared in a pairwise comparison, with the assigned arc oriented towards the more preferred node. In this paper the Potential Method is used to solve a catering service selection problem. The result obtained with the Potential Method is compared with that of Extent Analysis. The Potential Method is found to produce the same ranking as Extent Analysis in AHP.
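The core idea of a potential-style ranking can be sketched as fitting node potentials whose arc-wise differences reproduce the pairwise preference intensities. This is an illustrative sketch only, assuming a least-squares fit with the first alternative's potential fixed at zero; the function name and the three-alternative example are invented, not taken from the paper:

```python
# Illustrative potential-style ranking (assumed least-squares formulation).
# Arcs (i, j, w): alternative j is preferred to alternative i with intensity w.
# Fit potentials x so that x[j] - x[i] ~= w, anchoring x[0] = 0.
def rank_by_potential(n, arcs):
    m = n - 1  # free potentials x[1..n-1]
    M = [[0.0] * m for _ in range(m)]
    b = [0.0] * m
    for i, j, w in arcs:
        # Each arc contributes (x[j] - x[i] - w)^2 to the objective;
        # accumulate the corresponding normal equations.
        for a, sign in ((j, 1.0), (i, -1.0)):
            if a == 0:
                continue
            b[a - 1] += sign * w
            for c, sign2 in ((j, 1.0), (i, -1.0)):
                if c != 0:
                    M[a - 1][c - 1] += sign * sign2
    # Tiny Gaussian elimination with partial pivoting (fine for small n).
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = M[r][col] / M[col][col]
            for c in range(col, m):
                M[r][c] -= f * M[col][c]
            b[r] -= f * b[col]
    x = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = b[r] - sum(M[r][c] * x[c] for c in range(r + 1, m))
        x[r] = s / M[r][r]
    potentials = [0.0] + x
    return sorted(range(n), key=lambda k: potentials[k], reverse=True)

# Three hypothetical caterers: 2 beats 1, 1 beats 0, 2 beats 0 strongly.
print(rank_by_potential(3, [(1, 2, 1.0), (0, 1, 1.0), (0, 2, 2.0)]))  # -> [2, 1, 0]
```

The consistent example above yields potentials 0, 1, 2, so alternative 2 ranks first, matching the arc orientations.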
Intracellular applications of fluorescence correlation spectroscopy: prospects for neuroscience.
Kim, Sally A; Schwille, Petra
2003-10-01
Based on time-averaging fluctuation analysis of small fluorescent molecular ensembles in equilibrium, fluorescence correlation spectroscopy has recently been applied to investigate processes in the intracellular milieu. The exquisite sensitivity of fluorescence correlation spectroscopy provides access to a multitude of measurement parameters (rates of diffusion, local concentration, states of aggregation and molecular interactions) in real time with fast temporal and high spatial resolution. The introduction of dual-color cross-correlation, imaging, two-photon excitation, and coincidence analysis coupled with fluorescence correlation spectroscopy has expanded the utility of the technique to encompass a wide range of promising applications in living cells that may provide unprecedented insight into understanding the molecular mechanisms of intracellular neurobiological processes.
Using SFOC to fly the Magellan Venus mapping mission
NASA Technical Reports Server (NTRS)
Bucher, Allen W.; Leonard, Robert E., Jr.; Short, Owen G.
1993-01-01
Traditionally, spacecraft flight operations at the Jet Propulsion Laboratory (JPL) have been performed by teams of spacecraft experts utilizing ground software designed specifically for the current mission. The Jet Propulsion Laboratory set out to reduce the cost of spacecraft mission operations by designing ground data processing software that could be used by multiple spacecraft missions, either sequentially or concurrently. The Space Flight Operations Center (SFOC) System was developed to provide the ground data system capabilities needed to monitor several spacecraft simultaneously and provide enough flexibility to meet the specific needs of individual projects. The Magellan Spacecraft Team utilizes the SFOC hardware and software designed for engineering telemetry analysis, both real-time and non-real-time. The flexibility of the SFOC System has allowed the spacecraft team to integrate their own tools with SFOC tools to perform the tasks required to operate a spacecraft mission. This paper describes how the Magellan Spacecraft Team is utilizing the SFOC System in conjunction with their own software tools to perform the required tasks of spacecraft event monitoring as well as engineering data analysis and trending.
Economics of polysilicon processes
NASA Technical Reports Server (NTRS)
Yaws, C. L.; Li, K. Y.; Chou, S. M.
1986-01-01
Techniques are being developed to provide lower cost polysilicon material for solar cells. Existing technology, which normally provides semiconductor industry polysilicon material, is undergoing changes and is also being used to provide polysilicon material for solar cells. Economics of new and existing technologies are presented for producing polysilicon. The economics are primarily based on the preliminary process design of a plant producing 1,000 metric tons/year of silicon. The polysilicon processes include: the Siemens process (hydrogen reduction of trichlorosilane); the Union Carbide process (silane decomposition); and the Hemlock Semiconductor process (hydrogen reduction of dichlorosilane). The economics include cost estimates of capital investment and product cost to produce polysilicon via each technology. Sensitivity analysis results are also presented to disclose the effect of major parameters such as utilities, labor, raw materials and capital investment.
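A one-at-a-time sensitivity analysis over cost parameters like those named (utilities, labor, raw materials, capital) can be sketched as follows. All dollar figures are made-up placeholders for illustration, not the study's estimates:

```python
# One-at-a-time cost sensitivity sketch; figures are invented placeholders.
base = {"raw_materials": 6.0, "utilities": 4.0, "labor": 3.0, "capital": 7.0}

def product_cost(components):
    # Total product cost ($/kg) as the sum of its cost elements.
    return sum(components.values())

def sensitivity(components, swing=0.10):
    # Perturb each element by +/-swing and record the resulting cost range.
    out = {}
    for name in components:
        hi = dict(components); hi[name] *= 1 + swing
        lo = dict(components); lo[name] *= 1 - swing
        out[name] = (product_cost(lo), product_cost(hi))
    return out

for name, (lo, hi) in sensitivity(base).items():
    print(f"{name}: ${lo:.2f} .. ${hi:.2f} per kg")
```

With these placeholder inputs the base cost is $20/kg, and the largest cost element (capital here) produces the widest swing, which is the kind of conclusion such a sensitivity table supports.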
Synchronization and information processing by an on-off coupling
NASA Astrophysics Data System (ADS)
Wei, G. W.; Zhao, Shan
2002-05-01
This paper proposes an on-off coupling process for chaos synchronization and information processing. An in-depth analysis of the net effect of a conventional coupling is performed. The stability of the process is studied. We show that the proposed controlled coupling process can locally minimize the smoothness and the fidelity of dynamical data. A digital filter expression for the on-off coupling process is derived and a connection is made to the Hanning filter. The utility and robustness of the proposed approach are demonstrated by chaos synchronization in Duffing oscillators, the spatiotemporal synchronization of noisy nonlinear oscillators, the estimation of the trend of a time series, and restoration of the contaminated solution of the nonlinear Schrödinger equation.
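For background on the filter the paper connects to: the classic 3-point Hanning (Hann) smoothing filter replaces each sample with a weighted average of itself and its neighbors, with weights 1/4, 1/2, 1/4. This is a generic textbook sketch, not the paper's own on-off coupling derivation:

```python
# Classic 3-point Hanning smoothing filter (weights 1/4, 1/2, 1/4);
# endpoints are left unfiltered in this minimal sketch.
def hanning_smooth(x):
    y = list(x)
    for n in range(1, len(x) - 1):
        y[n] = 0.25 * x[n - 1] + 0.5 * x[n] + 0.25 * x[n + 1]
    return y

# An alternating signal is flattened toward its local mean:
print(hanning_smooth([0.0, 4.0, 0.0, 4.0, 0.0]))  # -> [0.0, 2.0, 2.0, 2.0, 0.0]
```

Applied repeatedly, this filter suppresses the fastest oscillations first, which is why it serves as a reference point for trend-estimation and smoothing schemes.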
2013-01-01
Background Language comprehension requires decoding of complex, rapidly changing speech streams. Detecting changes of frequency modulation (FM) within speech is hypothesized as essential for accurate phoneme detection, and thus, for spoken word comprehension. Despite past demonstration of FM auditory evoked response (FMAER) utility in language disorder investigations, it is seldom utilized clinically. This report's purpose is to facilitate clinical use by explaining analytic pitfalls, demonstrating sites of cortical origin, and illustrating potential utility. Results FMAERs collected from children with language disorders, including Developmental Dysphasia, Landau-Kleffner syndrome (LKS), and autism spectrum disorder (ASD), and also normal controls - utilizing multi-channel reference-free recordings assisted by discrete source analysis - provided demonstrations of cortical origin and examples of clinical utility. Recordings from inpatient epileptics with indwelling cortical electrodes provided direct assessment of FMAER origin. The FMAER is shown to normally arise from bilateral posterior superior temporal gyri and the immediate temporal lobe surround. Childhood language disorders associated with prominent receptive deficits demonstrate absent left or bilateral FMAER temporal lobe responses. When receptive language is spared, the FMAER may remain present bilaterally. Analyses based upon mastoid or ear reference electrodes are shown to result in erroneous conclusions. Serial FMAER studies may dynamically track the status of underlying language processing in LKS. FMAERs in ASD with language impairment may be normal or abnormal. Cortical FMAERs can locate language cortex when conventional cortical stimulation does not. Conclusion The FMAER measures the processing by the superior temporal gyri and adjacent cortex of rapid frequency modulation within an auditory stream.
Clinical disorders associated with receptive deficits are shown to demonstrate absent left or bilateral responses. Serial FMAERs may be useful for tracking language change in LKS. Cortical FMAERs may augment invasive cortical language testing in epilepsy surgical patients. The FMAER may be normal in ASD and other language disorders when pathology spares the superior temporal gyrus and surround but presumably involves other brain regions. Ear/mastoid reference electrodes should be avoided and multichannel, reference free recordings utilized. Source analysis may assist in better understanding of complex FMAER findings. PMID:23351174
A CAD system and quality assurance protocol for bone age assessment utilizing digital hand atlas
NASA Astrophysics Data System (ADS)
Gertych, Arakadiusz; Zhang, Aifeng; Ferrara, Benjamin; Liu, Brent J.
2007-03-01
Determination of bone age assessment (BAA) in pediatric radiology is a task based on detailed analysis of a patient's left hand X-ray. The current standard utilized in clinical practice relies on a subjective comparison of the hand with patterns in the book atlas. The computerized approach to BAA (CBAA) utilizes automatic analysis of the regions of interest in the hand image. This procedure is followed by extraction of quantitative features sensitive to skeletal development that are further converted to a bone age value utilizing knowledge from the digital hand atlas (DHA). This also allows the system to provide BAA results resembling the current clinical approach. All developed methodologies have been combined into one CAD module with a graphical user interface (GUI). CBAA can also improve statistical and analytical accuracy based on a clinical work-flow analysis. For this purpose a quality assurance protocol (QAP) has been developed. Implementation of the QAP helped make the CAD more robust and find images that cannot meet conditions required by DHA standards. Moreover, the entire CAD-DHA system may gain further benefits if the clinical acquisition protocol is modified. The goal of this study is to present the performance improvement of the overall CAD-DHA system with the QAP and the comparison of the CAD results with the chronological age of 1,390 normal subjects from the DHA. The CAD workstation can process images from a local image database or from a PACS server.
The utilization of information technology in biomedicine
NASA Astrophysics Data System (ADS)
Isaev, E. A.; Tarasov, P. A.
2017-01-01
Biomedicine is a branch of medicine that studies the human body, its structure and function in health and disease, pathological conditions, and methods of diagnosis, treatment and correction [1]. At present, biomedicine makes extensive use of modern technical equipment to solve its diverse problems associated with data collection, storage, and analysis, and with process modeling. The goal of this article is to provide a brief analysis of existing technologies (big data, mobile and cloud technologies) in terms of their applicability to the needs of biomedicine.
Software for visualization, analysis, and manipulation of laser scan images
NASA Astrophysics Data System (ADS)
Burnsides, Dennis B.
1997-03-01
The recent introduction of laser surface scanning to scientific applications presents a challenge to computer scientists and engineers. Full utilization of this two-dimensional (2-D) and three-dimensional (3-D) data requires advances in techniques and methods for data processing and visualization. This paper explores the development of software to support the visualization, analysis and manipulation of laser scan images. Specific examples presented are from on-going efforts at the Air Force Computerized Anthropometric Research and Design (CARD) Laboratory.
NASA Technical Reports Server (NTRS)
Buckley, D. H.
1983-01-01
Surface profilometry and scanning electron microscopy were utilized to study changes in the surface of polymers when eroded. X-ray photoelectron spectroscopy (XPS) and depth profile analysis indicate the corrosion of metal and ceramic surfaces and reveal the diffusion of certain species into the surface to produce a change in mechanical properties. Ion implantation, nitriding, and plating and their effects on the surface are characterized. Auger spectroscopy analysis identified morphological properties of coatings applied to surfaces by sputter deposition.
Multi-factor Analysis of Pre-control Fracture Simulations about Projectile Material
NASA Astrophysics Data System (ADS)
Wan, Ren-Yi; Zhou, Wei
2016-05-01
The study of pre-controlled fracture of projectile material is helpful for improving the effective fragmentation of the projectile metal and the material utilization rate. Fragment muzzle velocity and lethality can be affected by the explosive charge and the manner of initiation. Finite element software can simulate the explosive rupture of a projectile with a pre-groove in the shell surface, and analysis of how typical node velocities change with time provides a reference for the design and optimization of pre-controlled fragmentation.
Exploring the relationship between planning and procurement in Western U.S. electric utilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carvallo Bodelon, Juan Pablo; Sanstad, Alan H.; Larsen, Peter H.
Integrated resource planning (IRP) is an important regulatory process used in many U.S. states to formulate and evaluate least-cost and risk-assessed portfolios to meet future load requirements for electric utilities. In principle, effective implementation of IRP seeks to assure regulators and the public that utility investment decisions, given uncertainty, are as cost-effective as possible. However, to date, there is no empirical assessment of the effectiveness of IRP implementation. In this analysis, we compare planning and procurement processes and actual decisions for a sample of twelve load serving entities (LSEs) across the Western U.S. from 2003-2014. The 2008/2009 recession provides a unique “stress test” of the planning process and offers an excellent opportunity to trace how procurement decisions responded to this largely unforeseen event. In aggregate, there is a general alignment between planned and procured supply-side capacity. However, there are significant differences in the choice of supply-side resources and type of ownership for individual LSEs. We develop case studies for three LSEs and find that subsequent plans differ significantly due to changes in the planning environment and that procurement decisions in some cases are impacted by factors that are not accounted for in the planning process. Our results reveal that only a limited amount of the information produced during the long-term planning process (e.g., forecasts, methods, and least cost/risk portfolios) is ultimately used during the procurement process, and that the latter process relies extensively on the most recent information available for decision making. These findings suggest that states' IRP rules and regulations mandating long-term planning horizons with the same analytical complexity throughout the planning period may not create useful information for the procurement process.
The social value of a long-term planning process that departs from procurement, and the balance between transparency and complexity of the planning and procurement processes, remain open questions.
Brennan, Sharon L; Wluka, Anita E; Gould, Haslinda; Nicholson, Geoffrey C; Leslie, William D; Ebeling, Peter R; Oldenburg, Brian; Kotowicz, Mark A; Pasco, Julie A
2012-01-01
The World Health Organization identifies that osteoporosis is one of the leading health problems in the Western world. An increased risk of fragility fracture is observed in more socially disadvantaged individuals in most Western countries. Dual-energy X-ray absorptiometry (DXA) is currently the procedure of choice to diagnose osteoporosis and assess fracture risk. We systematically reviewed the literature regarding social determinants of DXA utilization for osteoporosis detection in patients aged 50yr and older using a computer-aided search of MEDLINE, EMBASE, CINAHL, and PsychINFO from January 1994 to December 2010. Five cross-sectional studies, incorporating 16 separate analyses, were identified for inclusion in this review. The best evidence analysis identified limited evidence for a positive association between either income or education with DXA utilization; furthermore, the best evidence analysis found no evidence for an association between either marital status or working status and DXA utilization. Further research is required to identify whether a relationship exists and elucidate reasons for disparities in DXA utilization between different social groups, such as choice and referral processes, as a necessary precursor in identifying modifiable determinants and appropriate strategies to promote preventive screening to identify fracture risk. Copyright © 2012 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
What is in the flask? Going beyond inventories
NASA Astrophysics Data System (ADS)
Andres, R. J.; Patra, P. K.; Piper, S.
2010-12-01
Compiling accurate inventories is tough work. Spatial, temporal, and altitudinal constraints all impact inventory accuracy and utility. However, while there is considerable challenge in creating inventories, the creation process needs to be mindful of inventory utilization. No inventory is perfect for all needs, yet inventories can be constructed to meet many needs. This presentation focuses on the use of a global, monthly, fossil-fuel carbon dioxide inventory. This inventory serves as one input into an atmospheric general circulation model (AGCM) based chemistry-transport model (ACTM). The inquiry centers on whether fossil fuel emissions significantly impact the seasonal cycle of measured atmospheric carbon dioxide concentrations. Model results will be compared to Scripps Institution of Oceanography (SIO) flask and continuous analyzer data. Primary metrics to be used in the comparison are slope and correlation analyses. Slope analysis will help assess the degree to which model results agree with SIO data. Correlation analysis will help assess the degree to which the various model components (i.e., fossil fuels, terrestrial biosphere, oceans) contribute to the overall seasonal cycle. The importance of this example is that it couples inventory creation with inventory utilization. This demonstration of a new inventory data set shows the utility of carefully crafted inventory data sets to the broader community.
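The slope and correlation metrics described amount to an ordinary least-squares regression slope and Pearson's r between model output and observations. A minimal sketch with invented sample data (not SIO measurements):

```python
# Minimal slope + Pearson correlation sketch for model-vs-observation
# comparison; the four data points are invented for illustration.
def slope_and_corr(model, obs):
    n = len(model)
    mx = sum(model) / n
    my = sum(obs) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(model, obs))
    sxx = sum((x - mx) ** 2 for x in model)
    syy = sum((y - my) ** 2 for y in obs)
    slope = sxy / sxx                 # regression of obs on model; 1.0 = perfect agreement
    corr = sxy / (sxx * syy) ** 0.5   # Pearson r; 1.0 = perfectly in phase
    return slope, corr

s, r = slope_and_corr([1.0, 2.0, 3.0, 4.0], [1.1, 2.0, 2.9, 4.0])
print(s, r)
```

A slope near 1 indicates the model reproduces the amplitude of the observed cycle, while a high r indicates the phasing agrees; the two diagnostics capture different failure modes.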
Cram, Peter; Vijan, Sandeep; Wolbrink, Alex; Fendrick, A Mark
2003-01-01
Traditional cost-utility analysis assumes that all benefits from health-related interventions are captured by the quality-adjusted life-years (QALYs) gained by the few individuals whose outcome is improved by the intervention. However, it is possible that many individuals who do not directly benefit from an intervention receive utility, and therefore QALYs, because of the passive benefit (i.e., sense of security) provided by the existence of the intervention. The objective of this study was to evaluate the impact that varying quantities of passive benefit have on the cost-effectiveness of airline defibrillator programs. A decision analytic model with Markov processes was constructed to evaluate the cost-effectiveness of defibrillator deployment on domestic commercial passenger aircraft over 1 year. Airline passengers were assigned small incremental utility gains (0.001-0.01) during an estimated 3-hour flight to evaluate the impact of passive benefit on overall cost-effectiveness. In the base case analysis with no allowance for passive benefit, the cost-effectiveness of airline automated external defibrillator deployment was US $34,000 per QALY gained. If 1% of all passengers received a utility gain of 0.01, the cost-effectiveness ratio declined to US $30,000 per QALY. Cost-effectiveness was enhanced when the quantity of passive benefit was raised or the percentage of individuals receiving passive benefit increased. Automated external defibrillator deployment on passenger aircraft is likely to be cost-effective. If a small percentage of airline passengers receive incremental utility gains from the passive benefit of automated external defibrillator availability, the impact on overall cost-effectiveness may be substantial. Further research should attempt to clarify the magnitude and percentage of patients who receive passive benefit.
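The passive-benefit adjustment amounts to adding a small, short-lived QALY stream to the denominator of the cost-effectiveness ratio. A back-of-envelope sketch of that arithmetic follows; every input (cost, QALYs, passenger count) is an illustrative placeholder, not a value from the study's Markov model:

```python
# Back-of-envelope cost-effectiveness ratio with optional passive benefit.
# All inputs are illustrative placeholders, not the study's estimates.
def icer(cost, direct_qalys, passengers=0, fraction=0.0, utility_gain=0.0,
         flight_hours=3.0):
    # Passive benefit: a small utility gain enjoyed only for the flight's
    # duration, converted to QALYs (8766 hours per year).
    passive_qalys = passengers * fraction * utility_gain * flight_hours / 8766.0
    return cost / (direct_qalys + passive_qalys)

base = icer(3.4e6, 100.0)  # no passive benefit
with_pb = icer(3.4e6, 100.0, passengers=6e8, fraction=0.01, utility_gain=0.01)
print(round(base), round(with_pb))  # ratio falls once passive QALYs are counted
```

Even a utility gain of 0.01 for 1% of passengers adds meaningful QALYs when the exposed population is hundreds of millions of passengers, which is why the ratio drops noticeably.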
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harben, P E; Harris, D; Myers, S
Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization, and full 3D finite difference modeling as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site.
A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in the experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.
Overview of Sensitivity Analysis and Shape Optimization for Complex Aerodynamic Configurations
NASA Technical Reports Server (NTRS)
Newman, Perry A.; Newman, James C., III; Barnwell, Richard W.; Taylor, Arthur C., III; Hou, Gene J.-W.
1998-01-01
This paper presents a brief overview of some of the more recent advances in steady aerodynamic shape-design sensitivity analysis and optimization, based on advanced computational fluid dynamics. The focus here is on those methods particularly well-suited to the study of geometrically complex configurations and their potentially complex associated flow physics. When nonlinear state equations are considered in the optimization process, difficulties are found in the application of sensitivity analysis. Some techniques for circumventing such difficulties are currently being explored and are included here. Attention is directed to methods that utilize automatic differentiation to obtain aerodynamic sensitivity derivatives for both complex configurations and complex flow physics. Various examples of shape-design sensitivity analysis for unstructured-grid computational fluid dynamics algorithms are demonstrated for different formulations of the sensitivity equations. Finally, the use of advanced, unstructured-grid computational fluid dynamics in multidisciplinary analyses and multidisciplinary sensitivity analyses within future optimization processes is recommended and encouraged.
A Project Team Analysis Using Tuckman's Model of Small-Group Development.
Natvig, Deborah; Stark, Nancy L
2016-12-01
Concerns about equitable workloads for nursing faculty have been well documented, yet a standardized system for workload management does not exist. A project team was challenged to establish an academic workload management system when two dissimilar universities were consolidated. Tuckman's model of small-group development was used as the framework for the analysis of processes and effectiveness of a workload project team. Agendas, notes, and meeting minutes were used as the primary sources of information. Analysis revealed the challenges the team encountered. Utilization of a team charter was an effective tool in guiding the team to become a highly productive group. Lessons learned from the analysis are discussed. Guiding a diverse group into a highly productive team is complex. The use of Tuckman's model of small-group development provided a systematic mechanism to review and understand group processes and tasks. [J Nurs Educ. 2016;55(12):675-681.]. Copyright 2016, SLACK Incorporated.
Predictability and Prediction for an Experimental Cultural Market
NASA Astrophysics Data System (ADS)
Colbaugh, Richard; Glass, Kristin; Ormerod, Paul
Individuals are often influenced by the behavior of others, for instance because they wish to obtain the benefits of coordinated actions or infer otherwise inaccessible information. In such situations this social influence decreases the ex ante predictability of the ensuing social dynamics. We claim that, interestingly, these same social forces can increase the extent to which the outcome of a social process can be predicted very early in the process. This paper explores this claim through a theoretical and empirical analysis of the experimental music market described and analyzed in [1]. We propose a very simple model for this music market, assess the predictability of market outcomes through formal analysis of the model, and use insights derived through this analysis to develop algorithms for predicting market share winners, and their ultimate market shares, in the very early stages of the market. The utility of these predictive algorithms is illustrated through analysis of the experimental music market data sets [2].
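The cumulative-advantage dynamic described above can be sketched with a toy simulation (the parameters and mechanism below are illustrative assumptions, not the model of [1]): each consumer either copies the crowd in proportion to current download counts or picks at random, and the leader observed early in the run is compared with the final winner.

```python
import random

def simulate_market(n_songs=8, n_choices=2000, social_weight=0.8, seed=0):
    """Toy cumulative-advantage market: each consumer either copies the crowd
    (picks a song in proportion to its current downloads) or picks at random."""
    random.seed(seed)
    counts = [1] * n_songs          # every song starts with one download
    early_leader = None
    for t in range(n_choices):
        if random.random() < social_weight:
            pick = random.choices(range(n_songs), weights=counts)[0]
        else:
            pick = random.randrange(n_songs)
        counts[pick] += 1
        if t == n_choices // 10:    # snapshot the leader at 10% of the run
            early_leader = max(range(n_songs), key=counts.__getitem__)
    final_winner = max(range(n_songs), key=counts.__getitem__)
    return early_leader, final_winner

results = [simulate_market(seed=s) for s in range(40)]
hits = sum(early == final for early, final in results)
print(f"early leader also final winner in {hits}/40 runs")
```

With strong social influence the leader tends to lock in early, which is the intuition behind predicting winners in the very early stages of the market.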
NASA Technical Reports Server (NTRS)
Krist, Steven E.; Bauer, Steven X. S.
1999-01-01
The design process for developing the natural flow wing design on the HSR arrow wing configuration utilized several design tools and analysis methods. Initial fuselage/wing designs were generated with inviscid analysis and optimization methods in conjunction with the natural flow wing design philosophy. A number of designs were generated, satisfying different system constraints. Of the three natural flow wing designs developed, the NFWAc2 configuration is the design which satisfies the constraints utilized by McDonnell Douglas Aerospace (MDA) in developing a series of optimized configurations; a wind tunnel model of the MDA-designed OPT5 configuration was constructed and tested. The present paper is concerned with the viscous analysis and inverse design of the arrow wing configurations, including the effects of the installed diverters/nacelles. Analyses were conducted with OVERFLOW, a Navier-Stokes flow solver for overset grids. Inverse designs were conducted with OVERDISC, which couples OVERFLOW with the CDISC inverse design method. An initial system of overset grids was generated for the OPT5 configuration with installed diverters/nacelles. An automated regridding process was then developed to use the OPT5 component grids to create grids for the natural flow wing designs. The inverse design process was initiated using the NFWAc2 configuration as a starting point, eventually culminating in the NFWAc4 design, for which a wind tunnel model was constructed. Due to the time constraints on the design effort, initial analyses and designs were conducted with a fairly coarse grid; subsequent analyses have been conducted on a refined system of grids. Comparisons of the computational results to experiment are provided at the end of this paper.
Baumgart, André; Denz, Christof; Bender, Hans-Joachim; Schleppers, Alexander
2009-01-01
The complexity of the operating room (OR) requires that both structural (eg, department layout) and behavioral (eg, staff interactions) patterns of work be considered when developing quality improvement strategies. In our study, we investigated how these contextual factors influence outpatient OR processes and the quality of care delivered. The study setting was a German university-affiliated hospital performing approximately 6000 outpatient surgeries annually. During the 3-year-study period, the hospital significantly changed its outpatient OR facility layout from a decentralized (ie, ORs in adjacent areas of the building) to a centralized (ie, ORs in immediate vicinity of each other) design. To study the impact of the facility change on OR processes, we used a mixed methods approach, including process analysis, process modeling, and social network analysis of staff interactions. The change in facility layout was seen to influence OR processes in ways that could substantially affect patient outcomes. For example, we found a potential for more errors during handovers in the new centralized design due to greater interdependency between tasks and staff. Utilization of the mixed methods approach in our analysis, as compared with that of a single assessment method, enabled a deeper understanding of the OR work context and its influence on outpatient OR processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao
In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit from not only known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.
Quantitative analysis of ribosome–mRNA complexes at different translation stages
Shirokikh, Nikolay E.; Alkalaeva, Elena Z.; Vassilenko, Konstantin S.; Afonina, Zhanna A.; Alekhina, Olga M.; Kisselev, Lev L.; Spirin, Alexander S.
2010-01-01
Inhibition of primer extension by ribosome–mRNA complexes (toeprinting) is a proven and powerful technique for studying mechanisms of mRNA translation. Here we have assayed an advanced toeprinting approach that employs fluorescently labeled DNA primers, followed by capillary electrophoresis utilizing standard instruments for sequencing and fragment analysis. We demonstrate that this improved technique is not merely fast and cost-effective, but also brings the primer extension inhibition method up to the next level. The electrophoretic pattern of the primer extension reaction can be characterized with a precision unattainable by the common toeprint analysis utilizing radioactive isotopes. This method allows us to detect and quantify stable ribosomal complexes at all stages of translation, including initiation, elongation and termination, generated during the complete translation process in both the in vitro reconstituted translation system and the cell lysate. We also point out the unique advantages of this new methodology, including the ability to assay sites of the ribosomal complex assembly on several mRNA species in the same reaction mixture. PMID:19910372
Analysis of supersonic plug nozzle flowfield and heat transfer
NASA Technical Reports Server (NTRS)
Murthy, S. N. B.; Sheu, W. H.
1988-01-01
A number of problems pertaining to the flowfield in a plug nozzle, designed as a supersonic thruster nozzle, with provision for cooling the plug with a coolant stream admitted parallel to the plug wall surface, were studied. First, an analysis was performed of the inviscid, nonturbulent, gas dynamic interaction between the primary hot stream and the secondary coolant stream. A numerical prediction code for establishing the resulting flowfield with a dividing surface between the two streams, for various combinations of stagnation and static properties of the two streams, was utilized for illustrating the nature of interactions. Secondly, skin friction coefficient, heat transfer coefficient and heat flux to the plug wall were analyzed under smooth flow conditions (without shocks or separation) for various coolant flow conditions. A numerical code was suitably modified and utilized for the determination of heat transfer parameters in a number of cases for which data are available. Thirdly, an analysis was initiated for modeling turbulence processes in transonic shock-boundary layer interaction without the appearance of flow separation.
NASA Astrophysics Data System (ADS)
Hayata, K.; Yanagawa, K.; Koshiba, M.
1990-12-01
A mode field analysis is presented of the second-harmonic electromagnetic wave that radiates from a nonlinear core bounded by a dielectric cladding. With this analysis the ultimate performance of the organic crystal-cored single-mode optical fiber waveguide as a guided-wave frequency doubler is evaluated through the solution of nonlinear parametric equations derived from Maxwell's equations under some assumptions. As a phase-matching scheme, a Cerenkov approach is considered because of advantages in actual device applications, in which the phase matching is achievable between the fundamental guided LP01 mode and the second-harmonic radiation (leaky) mode. Calculated results for organic cores made of benzil, 4-(N,N-dimethylamino)-3-acetamidonitrobenzene, 2-methyl-4-nitroaniline, and 4'-nitrobenzylidene-3-acetamino-4-methoxyaniline provide useful data for designing an efficient fiber-optic wavelength converter utilizing nonlinear parametric processes. A detailed comparison is made between results for infinite and finite cladding thicknesses.
Kaipio, Johanna; Stenhammar, Hanna; Immonen, Susanna; Litovuo, Lauri; Axelsson, Minja; Lantto, Minna; Lahdenne, Pekka
2018-01-01
Patient feedback is considered important for healthcare organizations. However, measurement and analysis of patient reported data is useful only if gathered insights are transformed into actions. This article focuses on gathering and utilization of patient experience data at hospitals with the aim of supporting the development of patient-centered services. The study was designed to explore both current practices of collecting and utilizing patient feedback at hospitals as well as future feedback-related opportunities. Nine people working at different hierarchical levels of three university hospitals in Finland participated in in-depth interviews. Findings indicate that current feedback processes are poorly planned and inflexible. Some feedback data are gathered, but not systematically utilized. Currently, it is difficult to obtain a comprehensive picture of the situation. One future hope was to increase the amount of patient feedback to be able to better generalize and utilize the data. Based on the findings the following recommendations are given: attention to both patients' and healthcare staff's perspectives when collecting feedback, employing a coordinated approach for collecting and utilizing patient feedback, and organizational transformation towards a patient-centric culture.
Pedron, Sara; Winter, Vera; Oppel, Eva-Maria; Bialas, Enno
2017-08-23
Operating room (OR) efficiency continues to be a high priority for hospitals. In this context the concept of benchmarking has gained increasing importance as a means to improve OR performance. The aim of this study was to investigate whether and how participation in a benchmarking and reporting program for surgical process data was associated with a change in OR efficiency, measured through raw utilization, turnover times, and first-case tardiness. The main analysis is based on panel data from 202 surgical departments in German hospitals, which were derived from the largest database for surgical process data in Germany. Panel regression modelling was applied. Results revealed no clear, uniform effect of participation in the benchmarking and reporting program. The largest trend was observed for first-case tardiness. In contrast to expectations, turnover times showed a generally increasing trend during participation. For raw utilization, no statistically significant trend was observed. Subgroup analyses revealed differences in effects across hospital types and department specialties. Participation in a benchmarking and reporting program, and thus the availability of reliable, timely, and detailed analysis tools to support OR management, seemed to be correlated especially with improved timeliness of staff members regarding first-case starts. The increasing trend in turnover time reveals the absence of effective strategies to improve this aspect of OR efficiency in German hospitals and could have meaningful consequences for medium- and long-run capacity planning in the OR.
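The panel-regression idea behind such a study can be sketched with a within-transformation (fixed-effects) estimator; all data below are simulated, and the department count, effect size, and noise level are invented for the example, not the study's results.

```python
import random

# Hypothetical panel: quarterly turnover times (minutes) for surgical
# departments; estimate the effect of a 0/1 "participating in benchmarking"
# indicator after removing department fixed effects by within-demeaning.
random.seed(0)
n_dep, n_q, true_effect = 40, 8, 2.0
y, x = [], []
for d in range(n_dep):
    baseline = random.gauss(35, 6)                      # department fixed effect
    part = [1.0 if random.random() < 0.5 else 0.0 for _ in range(n_q)]
    obs = [baseline + true_effect * p + random.gauss(0, 3) for p in part]
    mean_y, mean_p = sum(obs) / n_q, sum(part) / n_q    # department means
    y += [v - mean_y for v in obs]                      # within transformation
    x += [p - mean_p for p in part]

# One-regressor OLS on the demeaned data recovers the participation effect.
beta = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)
print(f"estimated participation effect: {beta:+.2f} min (true {true_effect})")
```

Demeaning each department's series removes its time-invariant baseline, so the slope is identified only from within-department variation, which is the usual motivation for fixed-effects panel models.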
NASA Astrophysics Data System (ADS)
Mateos-Espejel, Enrique
The objective of this thesis is to develop, validate, and apply a unified methodology for the energy efficiency improvement of a Kraft process that addresses globally the interactions of the various process systems that affect its energy performance. An implementation strategy is the final result. An operating Kraft pulping mill situated in Eastern Canada with a production of 700 adt/d of high-grade bleached pulp was the case study. The Pulp and Paper industry is Canada's premier industry. It is characterized by large thermal energy and water consumption. Rising energy costs and more stringent environmental regulations have led the industry to refocus its efforts toward identifying ways to improve energy and water conservation. Energy and water aspects are usually analyzed independently, but in reality they are strongly interconnected. Therefore, there is a need for an integrated methodology, which considers energy and water aspects, as well as the optimal utilization and production of the utilities. The methodology consists of four successive stages. The first stage is the base case definition. The development of a focused, reliable and representative model of an operating process is a prerequisite to the optimization and fine tuning of its energy performance. A four-pronged procedure has been developed: data gathering, master diagram, utilities systems analysis, and simulation. The computer simulation has been focused on the energy and water systems. The second stage corresponds to the benchmarking analysis. The benchmarking of the base case has the objectives of identifying the process inefficiencies and to establish guidelines for the development of effective enhancement measures. The studied process is evaluated by a comparison of its efficiency to the current practice of the industry and by the application of new energy and exergy content indicators. The minimum energy and water requirements of the process are also determined in this step. 
The third stage is the core of the methodology; it represents the formulation of technically feasible energy enhancing options. Several techniques are applied in an iterative procedure to cast light on their synergies and counter-actions. The objective is to develop a path for improving the process so as to maximize steam savings while minimizing the investment required. The fourth stage is the implementation strategy. As the existing process configuration and operating conditions vary from process to process, it is important to develop a strategy for the implementation of energy enhancement programs in the most advantageous way for each case. A three-phase strategy was selected for the specific case study in the context of its management strategic plan: the elimination of fossil fuel, the production of power and the liberation of steam capacity. A post-benchmarking analysis is done to quantify the improvement of the energy efficiency. The performance indicators are computed after all energy enhancing measures have been implemented. Improving the process by applying the unified methodology results in substantially greater steam savings than applying its component techniques individually: energy savings of 5.6 GJ/adt (27% of the current requirement), water savings of 32 m3/adt (34% of the current requirement) and an electricity production potential of 44.5 MW. As a result of applying the unified methodology the process becomes eco-friendly as it does not require fossil fuel for producing steam; its water and steam consumptions are below the Canadian average and it produces large revenues from the production of green electricity.
NASA Astrophysics Data System (ADS)
Cross, M.
2016-12-01
An improved process for the identification of tree types from satellite imagery for tropical forests is needed for more accurate assessments of the impact of forests on the global climate. La Selva Biological Station in Costa Rica was the tropical forest area selected for this particular study. WorldView-3 imagery was utilized because of its high spatial, spectral and radiometric resolution, its availability, and its potential to differentiate species in a complex forest setting. The first step was to establish confidence in the high spatial and high radiometric resolution imagery from WorldView-3 in delineating tree types within a complex forest setting. In achieving this goal, ASD field spectrometer data were collected of specific tree species to establish solid ground control within the study site. The spectrometer data were collected from the top of each specific tree canopy utilizing established towers located at La Selva Biological Station so as to match the near-nadir view of the WorldView-3 imagery. The ASD data were processed utilizing the spectral response functions for each of the WorldView-3 bands to convert the ASD data into a band-specific reflectivity. This allowed direct comparison of the ASD spectrometer reflectance data to the WorldView-3 multispectral imagery. The WorldView-3 imagery was processed to surface reflectance using two standard atmospheric correction procedures and the proprietary DigitalGlobe Atmospheric Compensation (AComp) product. The most accurate correction process was identified through comparison to the spectrometer data collected. A series of statistical measures were then utilized to assess the accuracy of the processed imagery and which imagery bands are best suited for tree type identification. From this analysis, a segmentation/classification process was performed to identify individual tree type locations within the study area.
It is envisioned that the results of this study will improve traditional forest classification processes, provide more accurate assessments of species density and distribution, facilitate a more accurate biomass estimate of the tropical forest which will impact the accuracy of tree carbon storage estimates, and ultimately assist in developing a better overall characterization of tropical rainforest dynamics.
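Converting a field-spectrometer reflectance curve into a band-specific reflectivity via a sensor's spectral response function amounts to a response-weighted average over the band. The sketch below illustrates that computation; the sample reflectances and response weights are invented, not actual WorldView-3 values.

```python
def band_reflectance(reflectance, response):
    """Response-weighted average of spectrometer reflectance over one band,
    making the field measurement comparable to the satellite band value."""
    weighted = sum(r * s for r, s in zip(reflectance, response))
    return weighted / sum(response)

# Toy samples across a hypothetical band (e.g., every 10 nm).
rho = [0.05, 0.07, 0.12, 0.25, 0.40, 0.48]   # canopy reflectance samples
srf = [0.10, 0.50, 0.95, 1.00, 0.60, 0.15]   # band spectral response weights
print(f"band-average reflectance: {band_reflectance(rho, srf):.3f}")
```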
NASA Supportability Engineering Implementation Utilizing DoD Practices and Processes
NASA Technical Reports Server (NTRS)
Smith, David A.; Smith, John V.
2010-01-01
The Ares I design and development program made the determination early in the System Design Review Phase to utilize the DoD ILS and LSA approach for supportability engineering as an integral part of the system engineering process. This paper provides a review of the overall approach to design Ares-I with an emphasis on a more affordable, supportable, and sustainable launch vehicle. Discussions will include the requirements development, design influence, support concept alternatives, ILS and LSA planning, logistics support analyses/trades performed, LSA tailoring for the NASA Ares Program, support system infrastructure identification, ILS Design Review documentation, Working Group coordination, and overall ILS implementation. At the outset, the Ares I Project initiated the development of the Integrated Logistics Support Plan (ILSP) and a Logistics Support Analysis process to provide a path forward for the management of the Ares-I ILS program and supportability analysis activities. The ILSP provided the initial planning and coordination between the Ares-I Project Elements and the Ground Operation Project. The LSA process provided a systems engineering approach for developing the Ares-I supportability requirements, influencing the design for supportability, and developing alternative support concepts that satisfy the program operability requirements. The LSA planning and analysis results are documented in the Logistics Support Analysis Report. This document was required during the Ares-I System Design Review (SDR) and Preliminary Design Review (PDR) review cycles. To help coordinate the LSA process across the Ares-I project and between programs, the LSA Report is updated and released quarterly. A System Requirement Analysis was performed to determine the supportability requirements and technical performance measurements (TPMs).
Two working groups were established to support the management and implementation of the Ares-I ILS program: the Integrated Logistics Support Working Group (ILSWG) and the Logistics Support Analysis Record Working Group (LSARWG). The Ares I ILSWG was established to assess requirements, to conduct and evaluate analyses and trade studies associated with acquisition logistics and supportability processes, and to resolve Ares I integrated logistics and supportability issues. It established a strategic collaborative alliance for coordination of Logistics Support Analysis activities in support of the integrated Ares I vehicle design and development of logistics support infrastructure. A Joint Ares I - Orion LSAR Working Group was established to: 1) Guide the development of Ares-I and Orion LSAR data and serve as a model for future Constellation programs, 2) Develop rules and assumptions that will apply across the Constellation program with regards to the program's LSAR development, and 3) Maintain the Constellation LSAR Style Guide.
Assessment of MSFCs Process for the Development and Activation of Space Act Agreements
NASA Technical Reports Server (NTRS)
Daugherty, Rachel A.
2014-01-01
A Space Act Agreement (SAA) is a contractual vehicle that NASA utilizes to form partnerships with non-NASA entities to stimulate cutting-edge innovation within the science and technology communities while concurrently supporting the NASA missions. SAAs are similar to traditional contracts in that they involve the commitment of Agency resources but allow more flexibility and are more cost effective to implement than traditional contracts. Consequently, the use of SAAs to develop partnerships has greatly increased over the past several years. To facilitate this influx of SAAs, Marshall Space Flight Center (MSFC) developed a process during a kaizen event to streamline and improve the quality of SAAs developed at the Center level. This study assessed the current SAA process to determine if improvements could be implemented to increase productivity, decrease time to activation, and improve the quality of deliverables. Using a combination of direct procedural observation, personnel interviews, and statistical analysis, elements of the process in need of remediation were identified and potential solutions developed. The findings focus primarily on the difficulties surrounding tracking and enforcing process adherence and communication issues among stakeholders. Potential solutions include utilizing customer relationship management (CRM) software to facilitate process coordination and co-locating or potentially merging the two separate organizations involved in SAA development and activation at MSFC.
Steginga, Suzanne K; Occhipinti, Stefano
2004-01-01
The study investigated the utility of the Heuristic-Systematic Processing Model as a framework for the investigation of patient decision making. A total of 111 men recently diagnosed with localized prostate cancer were assessed using Verbal Protocol Analysis and self-report measures. Study variables included men's use of nonsystematic and systematic information processing, desire for involvement in decision making, and the individual differences of health locus of control, tolerance of ambiguity, and decision-related uncertainty. Most men (68%) preferred that decision making be shared equally between them and their doctor. Men's use of the expert opinion heuristic was related to men's verbal reports of decisional uncertainty and having a positive orientation to their doctor and medical care; a desire for greater involvement in decision making was predicted by a high internal locus of health control. Trends were observed for systematic information processing to increase when the heuristic strategy used was negatively affect-laden and when men were uncertain about the probabilities for cure and side effects. There was a trend for decreased systematic processing when the expert opinion heuristic was used. Findings were consistent with the Heuristic-Systematic Processing Model and suggest that this model has utility for future research in applied decision making about health.
Hughey, Justin R; Keen, Justin M; Brough, Chris; Saeger, Sophie; McGinity, James W
2011-10-31
Poorly water-soluble drug substances that exhibit high melting points are often difficult to successfully process by fusion-based techniques. The purpose of this study was to identify a suitable polymer system for meloxicam (MLX), a high melting point BCS class II compound, and investigate thermal processing techniques for the preparation of chemically stable single phase solid dispersions. Thermal and solution based screening techniques were utilized to screen hydrophilic polymers suitable for immediate release formulations. Results of the screening studies demonstrated that Soluplus® (SOL) provided the highest degree of miscibility and solubility enhancement. A hot-melt extrusion feasibility study demonstrated that high temperatures and extended residence times were required in order to render compositions amorphous, causing significant degradation of MLX. A design of experiments (DOE) was conducted on the KinetiSol® Dispersing (KSD) process to evaluate the effect of processing conditions on the chemical stability and amorphous character of MLX. The study demonstrated that ejection temperature significantly impacted MLX stability. All samples prepared by KSD were substantially amorphous. Dissolution analysis of the KSD processed solid dispersions showed increased dissolution rates and extent of supersaturation over the marketed generic MLX tablets.
Study of Dynamic Characteristics of Aeroelastic Systems Utilizing Randomdec Signatures
NASA Technical Reports Server (NTRS)
Chang, C. S.
1975-01-01
The feasibility of utilizing the random decrement method in conjunction with a signature analysis procedure to determine the dynamic characteristics of an aeroelastic system for the purpose of on-line prediction of the potential onset of flutter was examined. Digital computer programs were developed to simulate sampled response signals of a two-mode aeroelastic system. Simulated response data were used to test the random decrement method. A special curve-fit approach was developed for analyzing the resulting signatures. A number of numerical 'experiments' were conducted on the combined processes. The method is capable of determining frequency and damping values accurately from randomdec signatures of carefully selected lengths.
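The random decrement idea can be sketched in a few lines: average the segments of a randomly excited response that begin at an upward crossing of a trigger level, so the random forcing averages out and the free-decay signature remains. The sketch below assumes a single lightly damped mode modeled as an AR(2) recursion driven by white noise; the mode frequency, damping, and trigger level are all illustrative choices, not the paper's two-mode simulation.

```python
import math
import random

def randomdec_signature(x, trigger, seg_len):
    """Average segments of x starting at upward crossings of the trigger
    level; the forced response averages out, leaving the randomdec signature."""
    segs = [x[i:i + seg_len]
            for i in range(1, len(x) - seg_len)
            if x[i - 1] < trigger <= x[i]]
    return [sum(seg[k] for seg in segs) / len(segs) for k in range(seg_len)]

# Toy response: a lightly damped 2 Hz mode sampled at 100 Hz under white-noise
# excitation, modeled as an AR(2) process (pole radius r sets the damping).
random.seed(0)
r, w = 0.98, 2 * math.pi * 2.0 / 100.0
a1, a2 = 2 * r * math.cos(w), -r * r
x = [0.0, 0.0]
for _ in range(20000):
    x.append(a1 * x[-1] + a2 * x[-2] + random.gauss(0, 1))

rms = (sum(v * v for v in x) / len(x)) ** 0.5
sig = randomdec_signature(x, trigger=rms, seg_len=200)
print(f"signature starts at {sig[0]:.1f} (trigger {rms:.1f}) "
      f"and decays to {sig[-1]:.1f}")
```

A curve fit of a damped sinusoid to `sig` would then yield the modal frequency and damping, which is the role of the signature analysis step described above.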
Modeling of non-uniform spatial arrangement of fibers in a ceramic matrix composite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, S.; Tewari, A.; Gokhale, A.M.
In unidirectional fiber-reinforced composites, the spatial arrangement of fibers is often non-uniform. These non-uniformities are linked to the processing conditions, and they affect the properties of the composite. In this contribution, a recently developed digital image analysis technique is used to quantify the non-uniform spatial arrangement of Nicalon fibers in a ceramic matrix composite (CMC). These quantitative data are utilized to develop a six-parameter computer-simulated microstructure model that is statistically equivalent to the non-uniform microstructure of the CMC. The simulated microstructure can be utilized as an RVE (representative volume element) for micro-mechanical modeling studies.
Kim, Jaeik; Chey, Jeanyung; Kim, Sang-Eun; Kim, Hoyoung
2015-05-01
Education involves learning new information and acquiring cognitive skills. These require various cognitive processes including learning, memory, and language. Since cognitive processes activate associated brain areas, we proposed that the brains of elderly people with longer education periods would show traces of repeated activation as increased synaptic connectivity and capillary in brain areas involved in learning, memory, and language. Using positron emission tomography (PET), this study examined the effect of education on the human brain via regional cerebral glucose metabolism rates (rCMRglcs). 26 elderly women with high-level education (HEG) and 26 with low-level education (LEG) were compared with regard to their regional brain activation and association between the regions. Further, graph theoretical analysis using rCMRglcs was applied to examine differences in the functional network properties of the brain. The results showed that the HEG had higher rCMRglc in the ventral cerebral regions that are mainly involved in memory, language, and neurogenesis, while the LEG had higher rCMRglc in apical areas of the cerebrum mainly involved in motor and somatosensory functions. Functional connectivity investigated with graph theoretical analysis illustrated that the brains of the HEG compared to those of the LEG were overall more efficient, more resilient, and characterized by small-worldness. This may be one of the brain's mechanisms mediating the reserve effects found in people with higher education.
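The small-worldness comparison behind such graph theoretical analyses can be sketched without any neuroimaging data: a ring lattice has high clustering but long paths, while a few random shortcuts (Watts-Strogatz style rewiring) shorten paths sharply with little loss of clustering. Everything below is a generic illustration, not the study's connectivity data.

```python
import random
from collections import deque

def avg_path_length(adj):
    """Mean shortest-path length over reachable ordered node pairs (BFS)."""
    total, pairs = 0, 0
    for s in range(len(adj)):
        dist, q = {s: 0}, deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def clustering(adj):
    """Average local clustering coefficient over all nodes."""
    acc = 0.0
    for u in range(len(adj)):
        nbrs = list(adj[u])
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for i in range(k) for j in range(i + 1, k)
                    if nbrs[j] in adj[nbrs[i]])
        acc += 2 * links / (k * (k - 1))
    return acc / len(adj)

def ring_lattice(n, k):
    """Each node linked to its k nearest neighbors on either side."""
    return [{(i + d) % n for d in range(1, k + 1)} |
            {(i - d) % n for d in range(1, k + 1)} for i in range(n)]

def rewire(adj, p, seed=1):
    """Move each edge to a random new endpoint with probability p,
    creating long-range shortcuts."""
    random.seed(seed)
    adj = [set(s) for s in adj]
    n = len(adj)
    for u in range(n):
        for v in sorted(adj[u]):
            if v > u and random.random() < p:
                w = random.randrange(n)
                if w != u and w not in adj[u]:
                    adj[u].discard(v); adj[v].discard(u)
                    adj[u].add(w); adj[w].add(u)
    return adj

lattice = ring_lattice(100, 4)
shortcut = rewire(lattice, p=0.1)
print(f"lattice:     C={clustering(lattice):.2f}, L={avg_path_length(lattice):.2f}")
print(f"small-world: C={clustering(shortcut):.2f}, L={avg_path_length(shortcut):.2f}")
```

High clustering combined with short paths (relative to the lattice) is the "small-worldness" property; on brain networks the same two metrics are computed on graphs built from inter-regional correlations.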
Selecting essential information for biosurveillance--a multi-criteria decision analysis.
Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global biosurveillance community as a tool for optimizing the biosurveillance enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
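The additive form of Multi-Attribute Utility Theory used in such frameworks can be sketched in a few lines: each alternative gets a per-criterion utility, and a weighted sum ranks the alternatives. The criteria, weights, and stream scores below are hypothetical, invented for the example, not the study's actual evaluation.

```python
# Hypothetical data streams scored on three criteria (names and values
# are illustrative; utilities and weights are on a 0-1 scale).
criteria_weights = {"timeliness": 0.5, "coverage": 0.3, "cost": 0.2}
streams = {
    "clinic reports":    {"timeliness": 0.4, "coverage": 0.9, "cost": 0.8},
    "news scraping":     {"timeliness": 0.9, "coverage": 0.6, "cost": 0.9},
    "lab confirmations": {"timeliness": 0.2, "coverage": 0.7, "cost": 0.5},
}

def maut_score(scores, weights):
    """Additive multi-attribute utility: weighted sum of criterion utilities."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(streams, key=lambda s: maut_score(streams[s], criteria_weights),
                reverse=True)
for name in ranked:
    print(f"{name}: {maut_score(streams[name], criteria_weights):.2f}")
```

The additive form assumes the criteria are preferentially independent; eliciting defensible weights from stakeholders is usually the hard part of applying MAUT in practice.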
Suner, Aslı; Oruc, Ozlem Ege; Buke, Cagri; Ozkaya, Hacer Deniz; Kitapcioglu, Gul
2017-08-31
Hand hygiene is one of the most effective means of controlling nosocomial infections, and it is an important measure to avoid the transmission of pathogens. However, the compliance of healthcare workers (HCWs) with hand washing is still poor worldwide. Herein, we aimed to determine the best hand hygiene preference of infectious diseases and clinical microbiology (IDCM) specialists to prevent transmission of microorganisms from one patient to another. Expert opinions regarding the criteria that influence the best hand hygiene preference were collected through a questionnaire via face-to-face interviews. Afterwards, these opinions were examined with two widely used multi-criteria decision analysis (MCDA) methods, the Multi-Attribute Utility Theory (MAUT) and the Analytic Hierarchy Process (AHP). A total of 15 IDCM specialist opinions were collected from diverse private and public hospitals located in İzmir, Turkey. The mean age of the participants was 49.73 ± 8.46 years, and their mean experience in their fields was 17.67 ± 11.98 years. The findings obtained through the two distinct decision-making methods, the MAUT and the AHP, suggest that alcohol-based antiseptic solution (ABAS) has the highest utility (0.86) and priority (0.69) among the experts' choices. In conclusion, the MAUT and AHP decision models developed here indicate that rubbing the hands with ABAS is the most favorable choice for IDCM specialists to prevent nosocomial infection.
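A minimal sketch of the AHP priority computation mentioned above: a reciprocal pairwise-comparison matrix is reduced to a priority vector, here via the common column-normalization approximation to the principal eigenvector. The 3x3 matrix comparing hypothetical hand hygiene options is invented for illustration and does not reproduce the study's elicited judgments.

```python
def ahp_priorities(matrix):
    """Approximate AHP priority vector: mean of the column-normalized matrix rows."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    normalized = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(row) / n for row in normalized]

# Pairwise comparisons among three illustrative options:
# 0 = alcohol-based solution, 1 = antiseptic soap, 2 = plain soap and water.
# matrix[i][j] > 1 means option i is preferred over option j; the matrix is
# reciprocal (matrix[j][i] = 1 / matrix[i][j]).
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 0.5, 1.0],
]

w = ahp_priorities(pairwise)
print([round(x, 2) for x in w])  # highest priority first
```

A full AHP analysis would also compute the consistency ratio of the pairwise matrix before trusting the resulting priorities.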
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
A decision-support system for the analysis of clinical practice patterns.
Balas, E A; Li, Z R; Mitchell, J A; Spencer, D C; Brent, E; Ewigman, B G
1994-01-01
Several studies have documented substantial variation in medical practice patterns, but physicians often do not have adequate information on the cumulative clinical and financial effects of their decisions. The purpose of developing an expert system for the analysis of clinical practice patterns was to assist providers in analyzing and improving the process and outcome of patient care. The developed QFES (Quality Feedback Expert System) helps users in the definition and evaluation of measurable quality improvement objectives. Based on objectives and actual clinical data, several measures can be calculated (utilization of procedures, annualized cost effect of using a particular procedure, and expected utilization based on peer comparison and case-mix adjustment). The quality management rules help to detect important discrepancies among members of the selected provider group and compare performance with objectives. The system incorporates a variety of data and knowledge bases: (i) clinical data on actual practice patterns, (ii) frames of quality parameters derived from clinical practice guidelines, and (iii) rules of quality management for data analysis. An analysis of the practice patterns of 12 family physicians in the management of urinary tract infections illustrates the use of the system.
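A minimal sketch of two of the measures the abstract lists: per-provider procedure utilization and an expected utilization derived from peer comparison (without the case-mix adjustment the system also performs). The case records and field names are invented for illustration.

```python
# Toy case records: which provider saw the case and whether the procedure was used.
cases = [
    {"provider": "A", "procedure_used": True},
    {"provider": "A", "procedure_used": False},
    {"provider": "A", "procedure_used": True},
    {"provider": "B", "procedure_used": False},
    {"provider": "B", "procedure_used": False},
    {"provider": "B", "procedure_used": True},
]

def utilization(cases, provider):
    """Fraction of a provider's own cases in which the procedure was used."""
    mine = [c for c in cases if c["provider"] == provider]
    return sum(c["procedure_used"] for c in mine) / len(mine)

def peer_expected(cases, provider):
    """Expected utilization for a provider, estimated from all other providers."""
    peers = [c for c in cases if c["provider"] != provider]
    return sum(c["procedure_used"] for c in peers) / len(peers)

print(round(utilization(cases, "A"), 2))    # 0.67 (2 of 3 cases)
print(round(peer_expected(cases, "A"), 2))  # 0.33 (peers used it in 1 of 3)
```

Flagging provider A would then amount to comparing the observed 0.67 against the peer-based expectation of 0.33.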
Non-Toxic Gold Nanoclusters for Solution-Processed White Light-Emitting Diodes.
Chao, Yu-Chiang; Cheng, Kai-Ping; Lin, Ching-Yi; Chang, Yu-Li; Ko, Yi-Yun; Hou, Tzu-Yin; Huang, Cheng-Yi; Chang, Walter H; Lin, Cheng-An J
2018-06-11
Solution-processed optoelectronic devices are attractive because of their potentially low-cost fabrication and compatibility with flexible substrates. However, the use of toxic elements such as lead and cadmium in current optoelectronic devices based on colloidal quantum dots raises environmental concerns. Here we demonstrate that white-light-emitting diodes can be achieved by utilizing non-toxic and environment-friendly gold nanoclusters. Yellow-light-emitting gold nanoclusters were synthesized and capped with trioctylphosphine. These gold nanoclusters were then blended with blue-light-emitting organic host materials to form the emissive layer. A current efficiency of 0.13 cd/A was achieved. The Commission Internationale de l'Eclairage chromaticity coordinates of (0.27, 0.33) were obtained from our experimental analysis, which is quite close to the ideal pure white emission coordinates (0.33, 0.33). Potential applications include innovative lighting devices and monitor backlights.
Use of laser range finders and range image analysis in automated assembly tasks
NASA Technical Reports Server (NTRS)
Alvertos, Nicolas; Dcunha, Ivan
1990-01-01
The effect of filtering processes on range images is studied, and the performance of two different laser range mappers is evaluated. Median filtering is utilized to remove noise from the range images. First and second order derivatives are then utilized to locate the similarities and dissimilarities between the processed and the original images. Range depth information is converted into spatial coordinates, and a set of coefficients which describe 3-D objects is generated using the algorithm developed in the second phase of this research. Range images of spheres and cylinders are used for experimental purposes. An algorithm is developed to compare the performance of two different laser range mappers based upon the range depth information of surfaces generated by each of the mappers. Furthermore, an approach based on 2-D analytic geometry is also proposed which serves as a basis for the recognition of regular 3-D geometric objects.
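A minimal sketch of the noise-removal step described above: a 3x3 median filter applied to a small synthetic "range image". The toy depth map and its single noise spike are invented for illustration; a real pipeline would operate on full-resolution mapper output.

```python
def median_filter_3x3(img):
    """Apply a 3x3 median filter to a 2-D list of depth values.
    Border pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = sorted(img[i + di][j + dj]
                            for di in (-1, 0, 1) for dj in (-1, 0, 1))
            out[i][j] = window[4]  # median of the 9 values in the window
    return out

# 5x5 flat depth map with a single noise spike in the middle.
img = [[10] * 5 for _ in range(5)]
img[2][2] = 90

filtered = median_filter_3x3(img)
print(filtered[2][2])  # 10: the spike is removed, the flat surface preserved
```

This edge-preserving behavior is why median filtering is preferred over mean filtering before taking the first and second order derivatives mentioned in the abstract: a linear smoother would blur the very depth discontinuities the derivatives are meant to find.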
Complete hazard ranking to analyze right-censored data: An ALS survival study.
Huang, Zhengnan; Zhang, Hongjiu; Boss, Jonathan; Goutman, Stephen A; Mukherjee, Bhramar; Dinov, Ivo D; Guan, Yuanfang
2017-12-01
Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners, including Gaussian process regression, Lasso, and random forest, on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge, where it ranked first, giving more accurate ranking predictions on the PRO-ACT ALS dataset than the Cox proportional hazards model. By utilizing right-censored data in its training process, the method demonstrated its state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.
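A minimal sketch of the general idea behind a ranking-based transform of right-censored survival data: subjects are scored by how many observed events they outlived, and censored subjects receive partial credit because their true survival time is only known to exceed the censoring time. This is a deliberately simplified illustration of the concept, not the published GuanRank algorithm.

```python
def hazard_ranks(records):
    """records: list of (time, event) pairs, event=1 for death, 0 for censoring.
    Returns a rank-like hazard score in [0, 1] per record (higher = worse)."""
    n = len(records)
    scores = []
    for t_i, e_i in records:
        # Count observed events this record definitely outlived.
        outlived = sum(1 for t_j, e_j in records if e_j == 1 and t_j < t_i)
        score = (n - 1 - outlived) / (n - 1)
        if e_i == 0:
            score *= 0.5  # crude partial credit: censoring means survival past t_i
        scores.append(score)
    return scores

# (time, event): two deaths at t=2 and t=5, one censoring at t=3, a death at t=8.
data = [(2.0, 1), (5.0, 1), (3.0, 0), (8.0, 1)]
print([round(s, 2) for s in hazard_ranks(data)])  # earliest death scores highest
```

Once every subject has such a continuous hazard score, ordinary regressors (random forest, Lasso, Gaussian processes) can be trained on it directly, which is the point of the transformation described in the abstract.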
Yasui, Yutaka; McLerran, Dale; Adam, Bao-Ling; Winget, Marcy; Thornquist, Mark; Feng, Ziding
2003-01-01
Discovery of "signature" protein profiles that distinguish disease states (e.g., malignant, benign, and normal) is a key step towards translating recent advancements in proteomic technologies into clinical utility. Protein data generated from mass spectrometers are, however, large and have complex features due to complexities in both the biological specimens and the interfering biochemical/physical processes of the measurement procedure. Making sense of such high-dimensional complex data is challenging and necessitates the use of a systematic data analytic strategy. We propose here a data processing strategy for two major issues in the analysis of such mass-spectrometry-generated proteomic data: (1) separation of protein "signals" from background "noise" in protein intensity measurements and (2) calibration of protein mass/charge measurements across samples. We illustrate the two issues and the utility of the proposed strategy using data from a prostate cancer biomarker discovery project as an example.
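A minimal sketch of the first issue raised above, separating peak "signal" from background "noise" in intensity measurements: a moving-minimum baseline is subtracted from a toy spectrum, and local maxima above a threshold are kept as peaks. The spectrum, window size, and threshold are all invented for illustration and are not the paper's proposed procedure.

```python
def subtract_baseline(intensities, half_window=2):
    """Estimate the background as a moving minimum and subtract it."""
    n = len(intensities)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        out.append(intensities[i] - min(intensities[lo:hi]))
    return out

def pick_peaks(intensities, threshold):
    """Indices of local maxima whose baseline-corrected height exceeds threshold."""
    return [i for i in range(1, len(intensities) - 1)
            if intensities[i] > threshold
            and intensities[i] >= intensities[i - 1]
            and intensities[i] >= intensities[i + 1]]

# Toy intensity trace with two peaks riding on a flat background of 1-2 units.
spectrum = [1, 2, 9, 2, 1, 1, 2, 8, 2, 1]
corrected = subtract_baseline(spectrum)
print(pick_peaks(corrected, threshold=5))  # [2, 7]
```

The second issue, calibrating mass/charge positions across samples, would then amount to aligning the detected peak indices across spectra before any cross-sample comparison.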