Gonzalez, John; Trickett, Edison J.
2014-01-01
This paper describes the processes we engaged in to develop a measurement protocol used to assess the outcomes of a community-based suicide and alcohol abuse prevention project with two Alaska Native communities. While the literature on community-based participatory research (CBPR) is substantial regarding the importance of collaboration, few studies have reported on this collaboration in the process of developing measures to assess CBPR projects. We first tell a story of the processes around the standard issues of doing cross-cultural work on measurement development related to areas of equivalence. A second story highlights how community differences within the same cultural group can affect both the process and content of culturally relevant measurement selection, adaptation, and development. PMID:24748283
A method for developing outcome measures in the clinical laboratory.
Jones, J
1996-01-01
Measuring and reporting outcomes in health care is becoming more important for quality assessment, utilization assessment, accreditation standards, and negotiating contracts in managed care. How does one develop an outcome measure for the laboratory to assess the value of its services? A method is described which outlines seven steps in developing outcome measures for a laboratory service or process. These steps include the following: 1. Identify the process or service to be monitored for performance and outcome assessment. 2. If necessary, form a multidisciplinary team of laboratory staff, other department staff, physicians, and pathologists. 3. State the purpose of the test or service, including a review of published data for the clinical-pathological correlation. 4. Prepare a process cause-and-effect diagram including steps critical to the outcome. 5. Identify key process variables that contribute to positive or negative outcomes. 6. Identify outcome measures that are not process measures. 7. Develop an operational definition, identify data sources, and collect data. Examples, including a process cause-and-effect diagram, process variables, and outcome measures, are given using the Therapeutic Drug Monitoring (TDM) service. A summary of conclusions and precautions for outcome measurement is then provided.
Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd
2016-03-01
The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback of participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected on process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.
Measuring the software process and product: Lessons learned in the SEL
NASA Technical Reports Server (NTRS)
Basili, V. R.
1985-01-01
The software development process and product can and should be measured. The software measurement process at the Software Engineering Laboratory (SEL) has taught a major lesson: develop a goal-driven paradigm (also characterized as a goal/question/metric paradigm) for data collection. Project analysis under this paradigm leads to a design for evaluating and improving the methodology of software development and maintenance.
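The goal-driven (goal/question/metric) paradigm named in the SEL lesson can be sketched as a small tree: a measurement goal is refined into questions, and each question into the metrics that answer it. The class names and the sample goal below are illustrative assumptions, not SEL artifacts.

```python
# Minimal sketch of a goal/question/metric (GQM) hierarchy. A measurement
# goal is refined into questions, and each question into concrete metrics.
# Class names and the sample goal are illustrative, not taken from the SEL.

from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    metrics: list          # names of metrics that answer this question

@dataclass
class Goal:
    purpose: str
    questions: list = field(default_factory=list)

    def all_metrics(self):
        """Flatten the tree: every metric to collect for this goal."""
        return [m for q in self.questions for m in q.metrics]

goal = Goal(
    purpose="Improve reliability of delivered software",
    questions=[
        Question("Where are defects introduced?",
                 ["defects per phase", "defect origin classification"]),
        Question("How effective are inspections?",
                 ["defects found per inspection hour"]),
    ],
)
print(goal.all_metrics())
```

Data collection is then driven top-down: only metrics reachable from a stated goal are collected, which is the goal-driven discipline the abstract describes.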
The evolution and development of an instrument to measure essential professional nursing practices.
Kramer, Marlene; Brewer, Barbara B; Halfer, Diana; Hnatiuk, Cynthia Nowicki; MacPhee, Maura; Schmalenberg, Claudia
2014-11-01
Nursing continues to evolve from a task-oriented occupation to a holistic professional practice. Increased professionalism requires accurate measurement of care processes and practice. Nursing studies often omit measurement of the relationship between structures in the work environment and processes of care or between processes of care and patient outcomes. Process measurement is integral to understanding and improving nursing practice. This article describes the development of an updated Essentials of Magnetism process measurement instrument for clinical nurses (CNs) practicing on inpatient units in hospitals. It has been renamed Essential Professional Nursing Practices: CN.
Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R
2012-05-17
Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts.
To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.
ERIC Educational Resources Information Center
Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju
2014-01-01
The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…
Kahn, Jeremy M; Gould, Michael K; Krishnan, Jerry A; Wilson, Kevin C; Au, David H; Cooke, Colin R; Douglas, Ivor S; Feemster, Laura C; Mularski, Richard A; Slatore, Christopher G; Wiener, Renda Soylemez
2014-05-01
Many health care performance measures are either not based on high-quality clinical evidence or not tightly linked to patient-centered outcomes, limiting their usefulness in quality improvement. In this report we summarize the proceedings of an American Thoracic Society workshop convened to address this problem by reviewing current approaches to performance measure development and creating a framework for developing high-quality performance measures by basing them directly on recommendations from well-constructed clinical practice guidelines. Workshop participants concluded that ideally performance measures addressing care processes should be linked to clinical practice guidelines that explicitly rate the quality of evidence and the strength of recommendations, such as the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) process. Under this framework, process-based performance measures would only be developed from strong recommendations based on high- or moderate-quality evidence. This approach would help ensure that clinical processes specified in performance measures are both of clear benefit to patients and supported by strong evidence. Although this approach may result in fewer performance measures, it would substantially increase the likelihood that quality-improvement programs based on these measures actually improve patient care.
Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko
2012-01-01
The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of “processing speed” may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive control and resulting in overestimation of processing-speed contributions to cognition. This concern may apply particularly to studies of developmental change, as even seemingly simple processing speed measures may require executive processes to keep children and older adults on task. We report two new studies and a re-analysis of a published study, testing predictions about how different processing speed measures influence conclusions about executive control across the life span. We find that the choice of processing speed measure affects the relationship observed between processing speed and executive control, in a manner that changes with age, and that choice of processing speed measure affects conclusions about development and the relationship among executive control measures. Implications for understanding processing speed, executive control, and their development are discussed. PMID:23432836
Quality improvement in neurology: AAN Parkinson disease quality measures
Cheng, E.M.; Tonn, S.; Swain-Eng, R.; Factor, S.A.; Weiner, W.J.; Bever, C.T.
2010-01-01
Background: Measuring the quality of health care is a fundamental step toward improving health care and is increasingly used in pay-for-performance initiatives and maintenance of certification requirements. Measure development to date has focused on primary care and common conditions such as diabetes; thus, the number of measures that apply to neurologic care is limited. The American Academy of Neurology (AAN) identified the need for neurologists to develop measures of neurologic care and to establish a process to accomplish this. Objective: To adapt and test the feasibility of a process for independent development by the AAN of measures for neurologic conditions for national measurement programs. Methods: A process that has been used nationally for measure development was adapted for use by the AAN. Topics for measure development are chosen based upon national priorities, available evidence base from a systematic literature search, gaps in care, and the potential impact for quality improvement. A panel composed of subject matter and measure development methodology experts oversees the development of the measures. Recommendation statements and their corresponding level of evidence are reviewed and considered for development into draft candidate measures. The candidate measures are refined by the expert panel during a 30-day public comment period and by review by the American Medical Association for Current Procedural Terminology (CPT) II codes. All final AAN measures are approved by the AAN Board of Directors. Results: Parkinson disease (PD) was chosen for measure development. A review of the medical literature identified 258 relevant recommendation statements. A 28-member panel approved 10 quality measures for PD that included full specifications and CPT II codes. Conclusion: The AAN has adapted a measure development process that is suitable for national measurement programs and has demonstrated its capability to independently develop quality measures. 
GLOSSARY AAN = American Academy of Neurology; ABPN = American Board of Psychiatry and Neurology; AMA = American Medical Association; CPT II = Current Procedural Terminology; PCPI = Physician Consortium for Performance Improvement; PD = Parkinson disease; PMAG = Performance Measurement Advisory Group; PQRI = Physician Quality Reporting Initiative; QMR = Quality Measurement and Reporting Subcommittee. PMID:21115958
NASA Technical Reports Server (NTRS)
Rey, Charles A.
1991-01-01
The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.
Eremenco, Sonya; Pease, Sheryl; Mann, Sarah; Berry, Pamela
2017-01-01
This paper describes the rationale and goals of the Patient-Reported Outcome (PRO) Consortium's instrument translation process. The PRO Consortium has developed a number of novel PRO measures which are in the process of qualification by the U.S. Food and Drug Administration (FDA) for use in clinical trials where endpoints based on these measures would support product labeling claims. Given the importance of FDA qualification of these measures, the PRO Consortium's Process Subcommittee determined that a detailed linguistic validation (LV) process was necessary to ensure that all translations of Consortium-developed PRO measures are performed using a standardized approach with the rigor required to meet regulatory and pharmaceutical industry expectations, and that a clearly defined instrument translation process was needed that the translation industry could support. The consensus process involved gathering information about current best practices from 13 translation companies with expertise in LV, consolidating the findings to generate a proposed process, and obtaining iterative feedback from the translation companies and PRO Consortium member firms on the proposed process in two rounds of review, in order to update existing principles of good practice in LV and to provide sufficient detail for the translation process to ensure consistency across PRO Consortium measures, sponsors, and translation companies. The consensus development resulted in a 12-step process that outlines universal and country-specific new translation approaches, as well as country-specific adaptations of existing translations. The PRO Consortium translation process will play an important role in maintaining the validity of the data generated through these measures by ensuring that they are translated by qualified linguists following a standardized and rigorous process that reflects best practice.
Validating a Measure of Stages of Change in Career Development
ERIC Educational Resources Information Center
Hammond, Marie S.; Michael, Tony; Luke, Charles
2017-01-01
Research on the processes of change in career development has focused on developmental stages rather than processes. This manuscript reports on the development and validation of the stages of change-career development scale, adapted from the McConnaughy, Prochaska, and Velicer (1983) measure of stages of change in psychotherapy. Data from 875…
Comparative Analysis of the Measurement of Total Instructional Alignment
ERIC Educational Resources Information Center
Kick, Laura C.
2013-01-01
In 2007, Lisa Carter created the Total Instructional Alignment system--a process that aligns standards, curriculum, assessment, and instruction. Employed in several hundred school systems, the TIA process is a successful professional development program. The researcher developed an instrument to measure the success of the TIA process with the…
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication among users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
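The abstract's point that change activity indicates project status can be illustrated with a toy calculation (a hedged sketch, not WISE's actual implementation): a per-period net count of opened versus closed change requests gives a simple backlog trend.

```python
# Hedged illustration, not the WISE implementation: derive a simple
# project-status indicator from change activity by counting change
# requests opened and closed in each reporting period.

from collections import Counter

def change_activity(events):
    """events: list of (week, kind) where kind is 'opened' or 'closed'.
    Returns {week: net_change} -- positive means the backlog grew."""
    net = Counter()
    for week, kind in events:
        net[week] += 1 if kind == "opened" else -1
    return dict(net)

events = [(1, "opened"), (1, "opened"), (1, "closed"),
          (2, "opened"), (2, "closed"), (2, "closed")]
print(change_activity(events))  # week 1: backlog grew; week 2: it shrank
```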
Quality Measures for the Care of Patients with Narcolepsy
Krahn, Lois E.; Hershner, Shelley; Loeding, Lauren D.; Maski, Kiran P.; Rifkin, Daniel I.; Selim, Bernardo; Watson, Nathaniel F.
2015-01-01
The American Academy of Sleep Medicine (AASM) commissioned a Workgroup to develop quality measures for the care of patients with narcolepsy. Following a comprehensive literature search, 306 publications were found addressing quality care or measures. Strength of association was graded between proposed process measures and desired outcomes. Following the AASM process for quality measure development, we identified three outcomes (including one outcome measure) and seven process measures. The first desired outcome was to reduce excessive daytime sleepiness by employing two process measures: quantifying sleepiness and initiating treatment. The second outcome was to improve the accuracy of diagnosis by employing two process measures: completing both a comprehensive sleep history and an objective sleep assessment. The third outcome was to reduce adverse events through three steps: ensuring treatment follow-up, documenting medical comorbidities, and documenting safety measures counseling. All narcolepsy measures described in this report were developed by the Narcolepsy Quality Measures Workgroup and approved by the AASM Quality Measures Task Force and the AASM Board of Directors. The AASM recommends the use of these measures as part of quality improvement programs that will enhance the ability to improve care for patients with narcolepsy. Citation: Krahn LE, Hershner S, Loeding LD, Maski KP, Rifkin DI, Selim B, Watson NF. Quality measures for the care of patients with narcolepsy. J Clin Sleep Med 2015;11(3):335–355. PMID:25700880
Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd
2016-07-01
We introduce Process Overview, a situation awareness characterisation of the knowledge derived from monitoring process plants. Process Overview is based on observational studies of process control work in the literature. The characterisation is applied to develop a query-based measure called the Process Overview Measure. The goal of the measure is to improve coupling between situation and awareness according to process plant properties and operator cognitive work. A companion article presents the empirical evaluation of the Process Overview Measure in a realistic process control setting. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA based on data collected by process experts. Practitioner Summary: The Process Overview Measure is a query-based measure for assessing operator situation awareness from monitoring process plants in representative settings.
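The inter-rater reliability reported in this abstract is commonly quantified with a chance-corrected agreement statistic such as Cohen's kappa; the sketch below is a generic illustration on made-up ratings, not necessarily the statistic the authors used.

```python
# Generic Cohen's kappa: chance-corrected agreement between two raters
# scoring the same items into categories. Ratings below are invented.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Expected agreement if both raters assigned categories independently.
    expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["correct", "partial", "correct", "wrong", "correct", "partial"]
b = ["correct", "partial", "correct", "correct", "correct", "wrong"]
print(round(cohens_kappa(a, b), 3))  # -> 0.429
```

A kappa of 1.0 means perfect agreement; values near 0 mean agreement no better than chance.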
Rosas, Scott R; Ridings, John W
2017-02-01
The past decade has seen an increase in measurement development research in the social and health sciences featuring the use of concept mapping as a core technique. The purpose, application, and utility of concept mapping have varied across this emerging literature. Despite the variety of uses and range of outputs, little has been done to critically review how researchers have approached the application of concept mapping in the measurement development and evaluation process. This article reviews the current state of practice regarding the use of concept mapping as a methodological tool in this process. We systematically reviewed 23 scale or measure development and evaluation studies and detail the application of concept mapping in the context of traditional measurement development and psychometric testing processes. Although several limitations surfaced, we found several strengths in the contemporary application of the method. We determined that concept mapping (a) provides a solid method for establishing content validity, (b) facilitates researcher decision-making, (c) offers insight into target population perspectives that are integrated a priori, and (d) establishes a foundation for analytical and interpretative choices. Based on these results, we outline how concept mapping can be situated in the measurement development and evaluation processes for new instrumentation.
NASA Astrophysics Data System (ADS)
Marculescu, Bogdan; Feldt, Robert; Torkar, Richard; Green, Lars-Goran; Liljegren, Thomas; Hult, Erika
2011-08-01
Verification and validation is an important part of software development and accounts for a significant amount of the costs associated with such a project. For developers of life- or mission-critical systems, such as software being developed for space applications, a balance must be reached between ensuring the quality of the system by extensive and rigorous testing, and reducing costs so the company can compete. Ensuring the quality of any system starts with a quality development process. To evaluate both the software development process and the product itself, measurements are needed. A balance must then be struck between ensuring the best possible quality of both process and product on the one hand, and reducing the cost of performing these measurements on the other. A number of measurements have already been defined and are being used. For some of these, data collection can be automated as well, further lowering the costs associated with implementing them. In practice, however, there may be situations where existing measurements are unsuitable for a variety of reasons. This paper describes a framework for creating low-cost, flexible measurements in areas where initial information is scarce. The framework, called the Measurements Exploration Framework, is aimed in particular at the space software development industry and was developed in such an environment.
NASA Astrophysics Data System (ADS)
Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.
2018-01-01
Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient, especially for the sample counting and measurement process: the sample must be changed and the measurement software set up for every one-hour counting period, and both procedures are performed manually for every sample. Hence, an automatic sample changer (ASC) system consisting of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
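The control flow described here (position a sample, count for one hour, record the result, repeat for up to 30 samples) can be sketched in Python. The real software is written in LabVIEW and drives GammaVision, so the callables below are hypothetical stand-ins for those hardware and measurement interfaces.

```python
# Hedged sketch of the ASC control loop described above. The actual system
# is LabVIEW calling the GammaVision software; move_to_detector,
# acquire_spectrum and save_spectrum are hypothetical stand-ins.

COUNT_TIME_S = 3600  # one-hour counting time per sample, as in the text

def run_batch(samples, move_to_detector, acquire_spectrum, save_spectrum):
    """Count up to 30 samples consecutively without operator intervention."""
    assert len(samples) <= 30, "the changer holds at most 30 samples"
    counted = []
    for sample_id in samples:
        move_to_detector(sample_id)                # hardware: position sample
        spectrum = acquire_spectrum(COUNT_TIME_S)  # measurement software call
        save_spectrum(sample_id, spectrum)         # persist result per sample
        counted.append(sample_id)
    return counted

# Dry run with stub callables (no hardware attached):
log = []
done = run_batch(
    ["S1", "S2"],
    move_to_detector=lambda s: log.append(("move", s)),
    acquire_spectrum=lambda t: {"live_time_s": t},
    save_spectrum=lambda s, sp: log.append(("save", s)),
)
```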
ERIC Educational Resources Information Center
Engeln-Maddox, Renee; Miller, Steven A.
2008-01-01
This article details the development of the Critical Processing of Beauty Images Scale (CPBI) and studies demonstrating the psychometric soundness of this measure. The CPBI measures women's tendency to engage in critical processing of media images featuring idealized female beauty. Three subscales were identified using exploratory factor analysis…
Development of real-time extensometer based on image processing
NASA Astrophysics Data System (ADS)
Adinanta, H.; Puranto, P.; Suryadi
2017-04-01
An extensometer system was developed using a high-definition web camera as the main sensor to track object position. The system applied digital image processing techniques to measure changes in object position. The position measurement was done in real time so that the system could directly show the actual position on both the x- and y-axes. In this research, the relation between pixel changes and object position changes was characterized. The system was tested by moving the target over a range of 20 cm in intervals of 1 mm. To verify long-run performance, continuous measurements on both the x- and y-axes were conducted for 83 hours to assess stability and linearity. The results show that this image-processing-based extensometer had both good stability and linearity.
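The pixel-to-displacement idea can be illustrated with a minimal sketch. The paper does not specify its tracking algorithm, so this example uses a simple intensity centroid, and the calibration factor is an invented value: locate the target in each frame, then scale the pixel shift to physical units.

```python
# Illustrative sketch (not the authors' algorithm): track a bright target
# by intensity centroid, then convert the pixel shift to displacement via
# a calibration factor. MM_PER_PIXEL below is a made-up value.

def centroid(image):
    """Intensity-weighted centroid (x, y) of a grayscale image (list of rows)."""
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v
            sx += x * v
            sy += y * v
    return sx / total, sy / total

MM_PER_PIXEL = 0.05  # hypothetical calibration from a characterization run

def displacement_mm(frame_before, frame_after):
    (x0, y0), (x1, y1) = centroid(frame_before), centroid(frame_after)
    return (x1 - x0) * MM_PER_PIXEL, (y1 - y0) * MM_PER_PIXEL

before = [[0, 0, 0, 0],
          [0, 9, 0, 0],
          [0, 0, 0, 0]]
after  = [[0, 0, 0, 0],
          [0, 0, 9, 0],
          [0, 0, 0, 0]]
print(displacement_mm(before, after))  # target moved one pixel along x
```

In a real system the calibration factor would come from imaging a target of known size, which is the pixel-to-position characterization the abstract mentions.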
Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh
2014-01-01
Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective, balanced set of performance measures and key performance indicators (KPIs) is a main challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following the analytic hierarchy process (AHP). The study was carried out in 2 phases: constructing ED performance measures based on balanced scorecard perspectives and incorporating them into an AHP framework to select the final KPIs. The respondents placed the most importance on the ED internal processes perspective, especially on measures related to timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected among the top KPIs. Measures of care effectiveness and care safety were placed as the next priorities. The respondents placed the least importance on disease-/condition-specific "time to" measures. The methodology can be presented as a reference model for the development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards designed based on such a balanced set of KPIs will help establish comprehensive performance measurements and fair benchmarks and comparisons.
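The prioritization step this abstract relies on derives weights for candidate KPIs from pairwise importance comparisons. The sketch below is a generic analytic-hierarchy-process calculation (geometric-mean approximation of the principal eigenvector), not the authors' tool, and the comparison values are invented.

```python
# Generic AHP weight derivation: given a reciprocal pairwise-comparison
# matrix (pairwise[i][j] = how much more important criterion i is than j,
# 1s on the diagonal), approximate priority weights via row geometric means.

def ahp_weights(pairwise):
    n = len(pairwise)
    geo = []
    for row in pairwise:
        g = 1.0
        for v in row:
            g *= v
        geo.append(g ** (1.0 / n))
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical comparisons among three balanced-scorecard perspectives:
# internal process judged 3x as important as customer, 5x as financial.
matrix = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_weights(matrix)
print([round(w, 3) for w in weights])  # internal process gets the largest weight
```

The resulting weights sum to 1 and rank the perspectives, mirroring how the study ranked internal processes above the financial, customer, and learning and growth perspectives.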
An exploratory survey of methods used to develop measures of performance
NASA Astrophysics Data System (ADS)
Hamner, Kenneth L.; Lafleur, Charles A.
1993-09-01
Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommended improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metrics development method. Those components were incorporated into a proposed metric development method that was based on the OMB Generic Method and should be more likely to produce high-quality metrics that will result in continuous process improvement.
The development of a science process assessment for fourth-grade students
NASA Astrophysics Data System (ADS)
Smith, Kathleen A.; Welliver, Paul W.
In this study, a multiple-choice test entitled the Science Process Assessment was developed to measure the science process skills of students in grade four. Based on the Recommended Science Competency Continuum for Grades K to 6 for Pennsylvania Schools, this instrument measured the skills of (1) observing, (2) classifying, (3) inferring, (4) predicting, (5) measuring, (6) communicating, (7) using space/time relations, (8) defining operationally, (9) formulating hypotheses, (10) experimenting, (11) recognizing variables, (12) interpreting data, and (13) formulating models. To prepare the instrument, classroom teachers and science educators were invited to participate in two science education workshops designed to develop an item bank of test questions applicable to measuring process skill learning. Participants formed writing teams and generated 65 test items representing the 13 process skills. After a comprehensive group critique of each item, 61 items were identified for inclusion into the Science Process Assessment item bank. To establish content validity, the item bank was submitted to a select panel of science educators for the purpose of judging item acceptability. This analysis yielded 55 acceptable test items and produced the Science Process Assessment, Pilot 1. Pilot 1 was administered to 184 fourth-grade students. Students were given a copy of the test booklet; teachers read each test aloud to the students. Upon completion of this first administration, data from the item analysis yielded a reliability coefficient of 0.73. Subsequently, 40 test items were identified for the Science Process Assessment, Pilot 2. Using the test-retest method, the Science Process Assessment, Pilot 2 (Test 1 and Test 2) was administered to 113 fourth-grade students. Reliability coefficients of 0.80 and 0.82, respectively, were ascertained. The correlation between Test 1 and Test 2 was 0.77. 
The results of this study indicate that (1) the Science Process Assessment, Pilot 2, is a valid and reliable instrument applicable to measuring the science process skills of students in grade four, (2) using educational workshops as a means of developing item banks of test questions is viable and productive in the test development process, and (3) involving classroom teachers and science educators in the test development process is educationally efficient and effective.
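The test-retest reliability reported above is, in essence, a Pearson correlation between scores from the two administrations. A minimal sketch follows; the paired score lists are illustrative, not the study's data.

```python
import math
import statistics

def pearson_r(xs, ys):
    """Pearson correlation between paired test and retest scores."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative paired scores from two administrations of a 40-item test.
test1 = [28, 31, 22, 35, 30, 26, 33, 24]
test2 = [27, 33, 24, 34, 28, 27, 31, 25]
r = pearson_r(test1, test2)
```

A value near the study's 0.77 would similarly indicate stable rank ordering of students across administrations.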
Measuring the impact of computer resource quality on the software development process and product
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Valett, Jon; Hall, Dana
1985-01-01
The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.
The (mis)use of subjective process measures in software engineering
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Condon, Steven E.
1993-01-01
A variety of measures are used in software engineering research to develop an understanding of the software process and product. These measures fall into three broad categories: quantitative, characteristics, and subjective. Quantitative measures are those to which a numerical value can be assigned, for example effort or lines of code (LOC). Characteristics describe the software process or product; they might include programming language or the type of application. While such factors do not provide a quantitative measurement of a process or product, they do help characterize them. Subjective measures (as defined in this study) are those that are based on the opinion or opinions of individuals; they are somewhat unique and difficult to quantify. Capturing of subjective measure data typically involves development of some type of scale. For example, 'team experience' is one of the subjective measures that were collected and studied by the Software Engineering Laboratory (SEL). Certainly, team experience could have an impact on the software process or product; actually measuring a team's experience, however, is not a strictly mathematical exercise. Simply adding up each team member's years of experience appears inadequate. In fact, most researchers would agree that 'years' do not directly translate into 'experience.' Team experience must be defined subjectively and then a scale must be developed e.g., high experience versus low experience; or high, medium, low experience; or a different or more granular scale. Using this type of scale, a particular team's overall experience can be compared with that of other teams in the development environment. Defining, collecting, and scaling subjective measures is difficult. First, precise definitions of the measures must be established. Next, choices must be made about whose opinions will be solicited to constitute the data. Finally, care must be given to defining the right scale and level of granularity for measurement.
Using Performance Measures in the Federal Budget Process
1993-07-01
[Abstract garbled in extraction. Recoverable section headings: Measuring Performance in the Public Sector; Obstacles to Measuring the Performance of Federal Agencies; How Can Performance Measurement Affect the Budget Process?; Conclusion.]
Performance measurement for information systems: Industry perspectives
NASA Technical Reports Server (NTRS)
Bishop, Peter C.; Yoes, Cissy; Hamilton, Kay
1992-01-01
Performance measurement has become a focal topic for information systems (IS) organizations. Historically, IS performance measures have dealt with the efficiency of the data processing function, but today the function of most IS organizations goes beyond simple data processing. To understand how IS organizations have developed meaningful performance measures that reflect their objectives and activities, industry perspectives on IS performance measurement were studied. The objectives of the study were to understand the state of the practice in IS performance measurement; to gather approaches and examples of actual performance measures used in industry; and to report patterns, trends, and lessons learned about performance measurement to NASA/JSC. Examples of how some of the most forward-looking companies are shaping their IS processes through measurement are provided. Thoughts on a life cycle of performance measure development and a suggested taxonomy for performance measurements are included in the appendices.
Measuring health care process quality with software quality measures.
Yildiz, Ozkan; Demirörs, Onur
2012-01-01
Existing quality models focus on specific diseases, clinics, or clinical areas. Although they contain structure-, process-, or output-type measures, no model measures the quality of health care processes comprehensively. Moreover, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model was developed from software quality measures. We adapted the ISO/IEC 9126 software quality standard to health care processes, then added measurable elements from the JCIAS (Joint Commission International Accreditation Standards for Hospitals) to the model scope to unify the functional requirements. Measurement results for the assessment (diagnosis) process are provided in this paper. After the application, we concluded that the model identifies weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of other organizations.
An Adaptive Kalman Filter Using a Simple Residual Tuning Method
NASA Technical Reports Server (NTRS)
Harman, Richard R.
1999-01-01
One difficulty in using Kalman filters in real-world situations is the selection of the correct process noise, measurement noise, and initial state estimate and covariance. These parameters are commonly referred to as tuning parameters. Multiple methods have been developed to estimate them; most, such as maximum likelihood, subspace, and observer/Kalman filter identification, require extensive offline processing and are not suitable for real-time use. One technique that is suitable for real-time processing is the residual tuning method. Any mismodeling of the filter tuning parameters results in a non-white sequence of filter measurement residuals, and the residual tuning technique uses this information to estimate corrections to those tuning parameters. The actual implementation is a set of sequential equations that run in parallel with the Kalman filter. A. H. Jazwinski developed a specialized version of this technique for estimating process noise; equations for estimating the measurement noise have also been developed. These algorithms are used to estimate the process noise and measurement noise for the Wide Field Infrared Explorer star tracker and gyro.
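A heavily simplified scalar sketch of the residual-tuning idea: for a random-walk state observed directly, the sample variance of the measurement residuals can be compared with the filter's predicted innovation variance to back out a corrected measurement-noise value. The model, constants, and estimator below are illustrative assumptions, not Jazwinski's actual formulation.

```python
import random

def kalman_step(x, P, z, Q, R):
    """One scalar Kalman step for a random-walk state observed directly."""
    P = P + Q               # predict covariance
    K = P / (P + R)         # Kalman gain
    residual = z - x        # measurement residual (innovation)
    x = x + K * residual    # update state
    P = (1 - K) * P         # update covariance
    return x, P, residual

def estimate_R(residuals, P, Q):
    """Residual-tuning sketch: for this model the innovation variance is
    (P + Q) + R, so the sample variance of the residuals yields an R estimate."""
    sample_var = sum(r * r for r in residuals) / len(residuals)
    return max(sample_var - (P + Q), 1e-9)

random.seed(0)
true_state, x, P = 5.0, 0.0, 1.0
Q, R = 1e-4, 0.25           # assumed process/measurement noise variances
residuals = []
for _ in range(200):
    z = true_state + random.gauss(0.0, 0.5)   # sensor with variance 0.25
    x, P, r = kalman_step(x, P, z, Q, R)
    residuals.append(r)
R_hat = estimate_R(residuals[50:], P, Q)      # skip the convergence transient
```

In a real filter these corrections would be applied recursively, in parallel with the state estimation, rather than in a single batch as here.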
In-depth analysis and characterization of a dual damascene process with respect to different CD
NASA Astrophysics Data System (ADS)
Krause, Gerd; Hofmann, Detlef; Habets, Boris; Buhl, Stefan; Gutsch, Manuela; Lopez-Gomez, Alberto; Kim, Wan-Soo; Thrun, Xaver
2018-03-01
In a 200 mm high-volume environment, we studied data from a dual damascene process. Dual damascene is a combination of lithography, etch, and CMP that creates copper lines and contacts in a single step. During these process steps, different metal CDs are measured by different measurement methods. In this study, we analyze the key numbers of the different measurements after different process steps and develop simple models to predict the electrical behavior. In addition, radial profiles of both inline measurement parameters and electrical parameters have been analyzed, and a matching method was developed based on inline and electrical data. Finally, a correlation analysis for radial signatures is presented that can be used to predict excursions in electrical signatures.
The Validation by Measurement Theory of Proposed Object-Oriented Software Metrics
NASA Technical Reports Server (NTRS)
Neal, Ralph D.
1996-01-01
Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics.
ERIC Educational Resources Information Center
Klaczynski, Paul A.; Fauth, James M.; Swanger, Amy
1998-01-01
The extent to which adolescents rely on rational versus experiential information processing was studied with 49 adolescents administered multiple measures of formal operations, two critical thinking questionnaires, a measure of rational processing, and a measure of ego identity status. Implications for studies of development are discussed in terms…
Thinking through cancer risk: characterizing smokers' process of risk determination.
Hay, Jennifer; Shuk, Elyse; Cruz, Gustavo; Ostroff, Jamie
2005-10-01
The perception of cancer risk motivates cancer risk reduction behaviors. However, common measurement strategies for cancer risk perceptions, which involve numerical likelihood estimates, do not adequately capture individuals' thoughts and feelings about cancer risk. To guide the development of novel measurement strategies, the authors used semistructured interviews to examine the thought processes used by smokers (N = 15) as they considered their cancer risk. They used grounded theory to guide systematic data coding and develop a heuristic model describing smokers' risk perception process that includes a cognitive, primarily rational process whereby salient personal risk factors for cancer are considered and combined, and an affective/attitudinal process, which shifts risk perceptions either up or down. The model provides a tentative explanation concerning how people hold cancer risk perceptions that diverge from rational assessment of their risks and will be useful in guiding the development of non-numerical measurements strategies for cancer risk perceptions.
Developing core outcome measurement sets for clinical trials: OMERACT filter 2.0.
Boers, Maarten; Kirwan, John R; Wells, George; Beaton, Dorcas; Gossec, Laure; d'Agostino, Maria-Antonietta; Conaghan, Philip G; Bingham, Clifton O; Brooks, Peter; Landewé, Robert; March, Lyn; Simon, Lee S; Singh, Jasvinder A; Strand, Vibeke; Tugwell, Peter
2014-07-01
Lack of standardization of outcome measures limits the usefulness of clinical trial evidence to inform health care decisions. This can be addressed by agreeing on a minimum core set of outcome measures per health condition, containing measures relevant to patients and decision makers. Since 1992, the Outcome Measures in Rheumatology (OMERACT) consensus initiative has successfully developed core sets for many rheumatologic conditions, actively involving patients since 2002. Its expanding scope required an explicit formulation of its underlying conceptual framework and process. Literature searches and iterative consensus process (surveys and group meetings) of stakeholders including patients, health professionals, and methodologists within and outside rheumatology. To comprehensively sample patient-centered and intervention-specific outcomes, a framework emerged that comprises three core "Areas," namely Death, Life Impact, and Pathophysiological Manifestations; and one strongly recommended Resource Use. Through literature review and consensus process, core set development for any specific health condition starts by identifying at least one core "Domain" within each of the Areas to formulate the "Core Domain Set." Next, at least one applicable measurement instrument for each core Domain is identified to formulate a "Core Outcome Measurement Set." Each instrument must prove to be truthful (valid), discriminative, and feasible. In 2012, 96% of the voting participants (n=125) at the OMERACT 11 consensus conference endorsed this model and process. The OMERACT Filter 2.0 explicitly describes a comprehensive conceptual framework and a recommended process to develop core outcome measurement sets for rheumatology likely to be useful as a template in other areas of health care. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Sevenster, M; Buurman, J; Liu, P; Peters, J F; Chang, P J
2015-01-01
Accumulating quantitative outcome parameters may contribute to constructing a healthcare organization in which the outcomes of clinical procedures are reproducible and predictable. In imaging studies, measurements are the principal category of quantitative parameters. The purpose of this work is to develop and evaluate two natural language processing engines that extract finding and organ measurements from narrative radiology reports and categorize the extracted measurements by their "temporality". The measurement extraction engine is implemented as a set of regular expressions and was evaluated against a manually created ground truth. Automated categorization of measurement temporality is framed as a machine learning problem: a ground truth was manually developed from a corpus of radiology reports, and a maximum entropy model was created using features that characterize the measurement itself and its narrative context. The model was evaluated in a ten-fold cross-validation protocol. The measurement extraction engine has a precision of 0.994 and a recall of 0.991; the accuracy of the measurement classification engine is 0.960. The work contributes to machine understanding of radiology reports and may find application in software that processes medical data.
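A toy illustration of the regular-expression approach to measurement extraction (the pattern below is a simplified assumption, not the authors' engine, which handles far more variants):

```python
import re

# Hypothetical pattern: matches sizes such as "3.2 cm", "14 mm", or "3.2 x 1.4 cm".
MEASUREMENT = re.compile(
    r"(\d+(?:\.\d+)?)"               # first dimension
    r"(?:\s*x\s*(\d+(?:\.\d+)?))?"   # optional second dimension
    r"\s*(cm|mm)\b",
    re.IGNORECASE,
)

def extract_measurements(report):
    """Return (dimensions, unit) tuples found in a narrative report."""
    results = []
    for m in MEASUREMENT.finditer(report):
        dims = [float(m.group(1))]
        if m.group(2):
            dims.append(float(m.group(2)))
        results.append((dims, m.group(3).lower()))
    return results
```

A production engine would also need to handle ranges, unit variants, and the narrative context that the paper's temporality classifier addresses.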
Space Station Application of Simulator-Developed Aircrew Coordination and Performance Measures
NASA Technical Reports Server (NTRS)
Murphy, Miles
1985-01-01
This paper summarizes a study in progress at NASA/Ames Research Center to develop measures of aircrew coordination and decision-making factors and to relate them to flight task performance, that is, to crew and system performance measures. The existence of some similar interpersonal process and task performance requirements suggests a potential application of these methods in space station crew research -- particularly research conducted in ground-based mock-ups. The secondary objective of this study should also be of interest: to develop information on crew process and performance for application in developing crew training programs.
NASA Astrophysics Data System (ADS)
Tower, Joshua P.; Kamieniecki, Emil; Nguyen, M. C.; Danel, Adrien
1999-08-01
The Surface Charge Profiler (SCP) has been introduced for monitoring and development of silicon epitaxial processes. The SCP measures the near-surface doping concentration and offers advantages that lead to yield enhancement in several ways. First, non-destructive measurement technology enables in-line process monitoring, eliminating the need to sacrifice production wafers for resistivity measurements. Additionally, the full-wafer mapping capability helps in development of improved epitaxial growth processes and early detection of reactor problems. As examples, we present the use of SCP to study the effects of susceptor degradation in barrel reactors and to study autodoping for development of improved dopant uniformity.
Neural Network Modeling for Gallium Arsenide IC Fabrication Process and Device Characteristics.
NASA Astrophysics Data System (ADS)
Creech, Gregory Lee, I.
This dissertation presents research focused on the utilization of neurocomputing technology to achieve enhanced yield and effective yield prediction in integrated circuit (IC) manufacturing. Artificial neural networks are employed to model complex relationships between material and device characteristics at critical stages of the semiconductor fabrication process. Whole wafer testing was performed on the starting substrate material and during wafer processing at four critical steps: Ohmic or Post-Contact, Post-Recess, Post-Gate and Final, i.e., at completion of fabrication. Measurements taken and subsequently used in modeling include, among others, doping concentrations, layer thicknesses, planar geometries, layer-to-layer alignments, resistivities, device voltages, and currents. The neural network architecture used in this research is the multilayer perceptron neural network (MLPNN). The MLPNN is trained in the supervised mode using the generalized delta learning rule. It has one hidden layer and uses continuous perceptrons. The research focuses on a number of different aspects. First is the development of inter-process stage models. Intermediate process stage models are created in a progressive fashion. Measurements of material and process/device characteristics taken at a specific processing stage and any previous stages are used as input to the model of the next processing stage characteristics. As the wafer moves through the fabrication process, measurements taken at all previous processing stages are used as input to each subsequent process stage model. Secondly, the development of neural network models for the estimation of IC parametric yield is demonstrated. Measurements of material and/or device characteristics taken at earlier fabrication stages are used to develop models of the final DC parameters. These characteristics are computed with the developed models and compared to acceptance windows to estimate the parametric yield. 
A sensitivity analysis is performed on the models developed during this yield estimation effort. This is accomplished by analyzing the total disturbance of network outputs due to perturbed inputs. When an input characteristic bears no, or little, statistical or deterministic relationship to the output characteristics, it can be removed as an input. Finally, neural network models are developed in the inverse direction. Characteristics measured after the final processing step are used as the input to model critical in-process characteristics. The modeled characteristics are used for whole wafer mapping and its statistical characterization. It is shown that this characterization can be accomplished with minimal in-process testing. The concepts and methodologies used in the development of the neural network models are presented. The modeling results are provided and compared to the actual measured values of each characteristic. An in-depth discussion of these results and ideas for future research are presented.
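The generalized delta rule on a one-hidden-layer perceptron, as used in the modeling above, can be sketched in a few dozen lines. The tiny network and synthetic separable task below are illustrative stand-ins for the wafer-measurement models, not the dissertation's actual architecture.

```python
import math
import random

random.seed(1)

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

class TinyMLP:
    """One-hidden-layer perceptron trained with the generalized delta rule."""

    def __init__(self, n_in, n_hid):
        rnd = lambda: random.uniform(-0.5, 0.5)
        self.w1 = [[rnd() for _ in range(n_in + 1)] for _ in range(n_hid)]  # +1 bias
        self.w2 = [rnd() for _ in range(n_hid + 1)]                          # +1 bias

    def forward(self, x):
        xb = x + [1.0]
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, xb)))
                  for row in self.w1] + [1.0]
        return sigmoid(sum(w * hi for w, hi in zip(self.w2, self.h)))

    def train_step(self, x, target, lr=0.5):
        y = self.forward(x)
        delta_out = (y - target) * y * (1 - y)   # output-layer error term
        xb = x + [1.0]
        for j in range(len(self.w1)):            # hidden-layer updates
            delta_h = delta_out * self.w2[j] * self.h[j] * (1 - self.h[j])
            for i in range(len(xb)):
                self.w1[j][i] -= lr * delta_h * xb[i]
        for j in range(len(self.h)):             # output-layer updates
            self.w2[j] -= lr * delta_out * self.h[j]
        return (y - target) ** 2

# Synthetic task: output 1 when the first input exceeds the second.
data = [([0.9, 0.1], 1.0), ([0.1, 0.9], 0.0),
        ([0.8, 0.3], 1.0), ([0.2, 0.7], 0.0)]
net = TinyMLP(n_in=2, n_hid=3)
initial_loss = sum(net.train_step(x, t) for x, t in data)
for _ in range(500):
    final_loss = sum(net.train_step(x, t) for x, t in data)
```

In the dissertation's setting, the inputs would be measurements from earlier process stages and the targets the characteristics of a later stage.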
Performance measurement: integrating quality management and activity-based cost management.
McKeon, T
1996-04-01
The development of an activity-based management system provides a framework for developing performance measures integral to quality and cost management. Performance measures that cross operational boundaries and embrace core processes provide a mechanism to evaluate operational results related to strategic intention and internal and external customers. The author discusses this measurement process that allows managers to evaluate where they are and where they want to be, and to set a course of action that closes the gap between the two.
Automated measurement of pressure injury through image processing.
Li, Dan; Mathews, Carol
2017-11-01
To develop an image processing algorithm to automatically measure pressure injuries using electronic pressure injury images stored in nursing documentation. Photographing pressure injuries and storing the images in the electronic health record is standard practice in many hospitals. However, manual measurement of pressure injury is time-consuming, challenging, and subject to intra- and inter-reader variability given the complexities of the pressure injury and the clinical environment. A cross-sectional algorithm development study. A set of 32 pressure injury images was obtained from a western Pennsylvania hospital. First, we transformed the images from the RGB (red, green and blue) colour space to the YCbCr colour space to eliminate interference from varying light conditions and skin colours. Second, a probability map generated by a skin-colour Gaussian model guided the pressure injury segmentation process using a Support Vector Machine classifier. Third, after segmentation, the reference ruler included in each of the images enabled perspective transformation and determination of pressure injury size. Finally, two nurses independently measured the 32 pressure injury images, and the intraclass correlation coefficient was calculated. An image processing algorithm was developed to automatically measure the size of pressure injuries. Both the inter- and intra-rater analyses achieved good reliability. Validation of the size measurement of the pressure injury (1) demonstrates that our image processing algorithm is a reliable approach to monitoring pressure injury progress through clinical pressure injury images and (2) offers new insight into pressure injury evaluation and documentation. Once our algorithm is further developed, clinicians can be provided with an objective, reliable and efficient computational tool for the segmentation and measurement of pressure injuries. 
With this, clinicians will be able to more effectively monitor the healing process of pressure injuries. © 2017 John Wiley & Sons Ltd.
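The first step, transforming RGB to YCbCr to separate luminance from chrominance (which reduces sensitivity to lighting), is a standard per-pixel operation. The sketch below uses the ITU-R BT.601 full-range coefficients; the paper does not state which variant its pipeline used.

```python
def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion for 8-bit values."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

# Skin tones cluster tightly in the (Cb, Cr) plane, which is what makes
# a Gaussian skin-colour model in this space workable.
y, cb, cr = rgb_to_ycbcr(220, 170, 140)
```

In the paper's pipeline this conversion would be applied to every pixel before evaluating the skin-colour Gaussian model.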
Proceedings of the Fifteenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1990-01-01
The Software Engineering Laboratory (SEL) is an organization sponsored by GSFC and created for the purpose of investigating the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are: (1) to understand the software development process in the GSFC environment; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. Fifteen papers were presented at the Fifteenth Annual Software Engineering Workshop in five sessions: (1) SEL at age fifteen; (2) process improvement; (3) measurement; (4) reuse; and (5) process assessment. The sessions were followed by two panel discussions: (1) experiences in implementing an effective measurement program; and (2) software engineering in the 1980's. A summary of the presentations and panel discussions is given.
Richler, Jennifer J.; Floyd, R. Jackie; Gauthier, Isabel
2014-01-01
Efforts to understand individual differences in high-level vision necessitate the development of measures that have sufficient reliability, which is generally not a concern in group studies. Holistic processing is central to research on face recognition and, more recently, to the study of individual differences in this area. However, recent work has shown that the most popular measure of holistic processing, the composite task, has low reliability. This is particularly problematic for the recent surge in interest in studying individual differences in face recognition. Here, we developed and validated a new measure of holistic face processing specifically for use in individual-differences studies. It avoids some of the pitfalls of the standard composite design and capitalizes on the idea that trial variability allows for better traction on reliability. Across four experiments, we refine this test and demonstrate its reliability. PMID:25228629
Measurement-based reliability/performability models
NASA Technical Reports Server (NTRS)
Hsueh, Mei-Chen
1987-01-01
Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
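One practical point above, that non-exponential holding times require semi-Markov machinery, can be illustrated with a small simulation of an alternating up/down process: the long-run availability depends only on the mean holding times, whatever their distributions. The rates and distributions below are hypothetical, not the IBM 3081 data.

```python
import math
import random

def availability(up_sampler, down_sampler, cycles=20000):
    """Long-run availability of an alternating renewal (semi-Markov) process:
    mean uptime / (mean uptime + mean downtime), whatever the distributions."""
    up = sum(up_sampler() for _ in range(cycles))
    down = sum(down_sampler() for _ in range(cycles))
    return up / (up + down)

random.seed(2)
# Markov assumption: exponential holding times (mean up 100 h, mean down 5 h).
a_markov = availability(lambda: random.expovariate(1 / 100.0),
                        lambda: random.expovariate(1 / 5.0))
# Semi-Markov: Weibull uptimes with the same mean but a non-exponential shape.
weibull_scale = 100.0 / math.gamma(1.5)   # shape 2 gives mean scale*gamma(1.5)
a_semi = availability(lambda: random.weibullvariate(weibull_scale, 2.0),
                      lambda: random.expovariate(1 / 5.0))
```

Transient behavior and state-dependent rewards, by contrast, do depend on the holding-time distributions, which is why the measured non-exponential holding times forced a semi-Markov model.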
Implementing a Process to Measure Return on Investment for Nursing Professional Development.
Garrison, Elisabeth; Beverage, Jodie
Return on investment (ROI) is one way to quantify the value that nursing professional development brings to the organization. This article describes a process to begin tracking ROI for nursing professional development. Implementing a process of tracking nursing professional development practitioners' ROI increased awareness of the financial impact and effectiveness of the department.
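The arithmetic behind ROI tracking is simple; the figures below are hypothetical, not the article's data.

```python
def roi(benefit, cost):
    """Return on investment as a fraction of cost: (benefit - cost) / cost."""
    return (benefit - cost) / cost

# Hypothetical: a development program costing $40,000 that averts
# $100,000 in turnover and orientation costs.
example_roi = roi(benefit=100_000, cost=40_000)   # 1.5, i.e. a 150% return
```

The hard part in practice, as the article implies, is attributing dollar benefits to professional development activities, not the calculation itself.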
NASA Astrophysics Data System (ADS)
Yin, Bo; Liu, Li; Wang, Jiahan; Li, Xiran; Liu, Zhenbo; Li, Dewei; Wang, Jun; Liu, Lu; Wu, Jun; Xu, Tingting; Cui, He
2017-10-01
Electric energy measurement is fundamental work: accurate measurements protect the economic interests of both parties to a power supply arrangement, and the standardized management of measurement laboratories at all levels directly affects the fairness of measurement. Currently, metering laboratories generally use one-dimensional bar codes as the recognition object and advance the testing process through manual management, and most test data must be entered by hand to generate reports. This process has many problems and potential risks: data cannot be saved completely, inspection status cannot be traced, the inspection process is not fully controllable, and so on. To meet the provincial metrology center's requirement for whole-process management of performance testing of power-measuring appliances, we used large-capacity RF tags as the process-management information medium and developed a general measurement experiment management system. We formulated a standardized full performance test process, improved the recording of raw experimental data, developed an automatic warehouse inventory device, and established a strict system for sample transfer and storage. The system ensures that all raw inspection data can be traced, achieves full life-cycle control of samples, and significantly improves the quality control level and the effectiveness of inspection work.
Do schema processes mediate links between parenting and eating pathology?
Sheffield, Alex; Waller, Glenn; Emanuelli, Francesca; Murray, James; Meyer, Caroline
2009-07-01
Adverse parenting experiences are commonly linked to eating pathology. A schema-based model of the development and maintenance of eating pathology proposes that one of the potential mediators of the link between parenting and eating pathology might be the development of schema maintenance processes--mechanisms that operate to help the individual avoid intolerable emotions. To test this hypothesis, 353 female students and 124 female eating-disordered clients were recruited. They completed a measure of perceived parenting experiences as related to schema development (Young Parenting Inventory-Revised (YPI-R)), two measures of schema processes (Young Compensatory Inventory; Young-Rygh Avoidance Inventory (YRAI)) and a measure of eating pathology (Eating Disorders Inventory (EDI)). In support of the hypothesis, certain schema processes did mediate the relationship between specific perceptions of parenting and particular forms of eating pathology, although these were different for the clinical and non-clinical samples. In those patients where parenting is implicated in the development of eating pathology, treatment might need to target the cognitive processes that can explain this link. 2009 John Wiley & Sons, Ltd and Eating Disorders Association
Image processing and analysis of Saturn's rings
NASA Technical Reports Server (NTRS)
Yagi, G. M.; Jepsen, P. L.; Garneau, G. W.; Mosher, J. A.; Doyle, L. R.; Lorre, J. J.; Avis, C. C.; Korsmo, E. P.
1981-01-01
Processing of Voyager image data of Saturn's rings at JPL's Image Processing Laboratory is described. A software system has been developed to navigate the flight images, facilitate feature tracking, and project the rings. This system has been used to measure ring radii and the velocities of the spoke features in the B-Ring. A projected ring movie to study the development of these spoke features has been generated. Finally, processing to facilitate comparison of the photometric properties of Saturn's rings at various phase angles is described.
Cognitive Process Development as Measured by an Adapted Version of Wechsler's Similarities Test
ERIC Educational Resources Information Center
Rozencwajg, Paulette
2007-01-01
This paper studies the development of taxonomic processing as measured by an adapted version of the Wechsler Similarities subtest, which distinguishes between categorization of concrete and abstract words. Two factors--age and concreteness--are also tested by a recall task. The results show an age-related increase in taxonomic categorization,…
ERIC Educational Resources Information Center
Labin, Susan N.
2014-01-01
A fundamental reason for doing evaluation capacity building (ECB) is to improve program outcomes. Developing common measures of outcomes and the activities, processes, and factors that lead to these outcomes is an important step in moving the science and the practice of ECB forward. This article identifies a number of existing ECB measurement…
Modeling spray/puddle dissolution processes for deep-ultraviolet acid-hardened resists
NASA Astrophysics Data System (ADS)
Hutchinson, John M.; Das, Siddhartha; Qian, Qi-De; Gaw, Henry T.
1993-10-01
A study of the dissolution behavior of acid-hardened resists (AHR) was undertaken for spray and spray/puddle development processes. The Site Services DSM-100 end-point detection system is used to measure both spray and puddle dissolution data for a commercially available deep-ultraviolet AHR, Shipley SNR-248. The DSM allows in situ measurement of dissolution rate on the wafer chuck and hence allows parameter extraction for modeling spray and puddle processes. The dissolution data for spray and puddle processes were collected across a range of exposure dose and postexposure bake temperature. The development recipe was varied to decouple the contributions of the spray and puddle modes to the overall dissolution characteristics. The mechanisms involved in spray versus puddle dissolution, and the impact of each mode on process performance metrics, have been investigated. Using the effective-dose-modeling approach and the measurement capability of the DSM-100, we developed a lumped parameter model for acid-hardened resists that incorporates the effects of exposure, postexposure bake temperature and time, and development conditions. The PARMEX photoresist-modeling program is used to determine parameters for the spray and for the puddle process. The lumped parameter AHR model showed good agreement with experimental data.
Measuring the wetting angle and perimeter of single wood pulp fibers : a modified method
John H. Klungness
1981-01-01
In pulp processing development it is often necessary to measure the effect of a process variable on individual pulp fiber wettability. Such processes would include drying of market pulps, recycling of secondary fibers, and surface modification of fibers as in sizing. However, if wettability is measured on a fiber sheet surface, the results are confounded by...
Lexical Development during Middle Infancy: A Mutually Driven Infant-Caregiver Process.
ERIC Educational Resources Information Center
Dunham, Philip; Dunham, Frances
1992-01-01
Mothers' utterances were measured during interactions with their 13-month-old infants and correlated with measures of infants' productive lexical development at 13 and 24 months. Correlations between maternal measures and infants' lexical development were lower for employed mothers than for mothers who were full-time caregivers. (BC)
Magasi, Susan; Harniss, Mark; Heinemann, Allen W
2018-01-01
Principles of fairness in testing require that all test takers, including people with disabilities, have an equal opportunity to demonstrate their capacity on the construct being measured. Measurement design features and assessment protocols can pose barriers for people with disabilities. Fairness in testing is a fundamental validity issue at all phases in the design, administration, and interpretation of measurement instruments in clinical practice and research. There is limited guidance for instrument developers on how to develop and evaluate the accessibility and usability of measurement instruments. This article describes a 6-stage iterative process for developing accessible computer-administered measurement instruments grounded in the procedures implemented across several major measurement initiatives. A key component of this process is interdisciplinary teams of accessibility experts, content and measurement experts, information technology experts, and people with disabilities working together to ensure that measurement instruments are accessible and usable by a wide range of users. The development of accessible measurement instruments is not only an ethical requirement; it also ensures better science by minimizing measurement bias, missing data, and attrition due to mismatches between the target population and the test administration platform and protocols. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
The Logic of Quantum Measurements
NASA Astrophysics Data System (ADS)
Vanni, Leonardo; Laura, Roberto
2013-07-01
We apply our previously developed formalism of contexts of histories, suitable for dealing with quantum properties at different times, to the measurement process. We explore the logical implications allowed by quantum theory concerning the realization of properties of the microscopic measured system before and after the measurement process with a given pointer value.
The validation by measurement theory of proposed object-oriented software metrics
NASA Technical Reports Server (NTRS)
Neal, Ralph D.
1994-01-01
Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured; i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement results only when we are able to verify that the number is representative of the attribute we wish to measure. Many proposed software metrics are used by practitioners without ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics (Li and Henry, 1993; Chidamber and Kemerer, 1994; Lorenz and Kidd, 1994).
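As an illustration of the kind of metric being validated, one of the Chidamber and Kemerer metrics, depth of inheritance tree (DIT), can be sketched for Python classes; the example classes below are hypothetical and not from the dissertation:

```python
def depth_of_inheritance(cls):
    """DIT: length of the longest path from a class to the root of the
    inheritance hierarchy (here, `object` is taken to have depth 0)."""
    if cls is object:
        return 0
    return 1 + max(depth_of_inheritance(base) for base in cls.__bases__)

# Hypothetical class hierarchy for illustration
class Sensor:
    pass

class TemperatureSensor(Sensor):
    pass

print(depth_of_inheritance(TemperatureSensor))  # → 2
```

A deep DIT suggests greater design complexity and reuse; measurement theory asks whether such a number actually behaves as an ordinal (or stronger) scale for the attribute "complexity".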
New portable instrument for the measurement of thermal conductivity in gas process conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Queirós, C. S. G. P.; Lourenço, M. J. V., E-mail: mjlourenco@fc.ul.pt; Vieira, S. I.
The development of high temperature gas sensors for the monitoring and determination of thermophysical properties of complex process mixtures at high temperatures faces several problems related to materials compatibility, the sensitivity of active sensing parts, and lifetime. Ceramic/thin-metal-film based sensors, previously developed for the determination of the thermal conductivity of molten materials up to 1200 °C, were redesigned, constructed, and applied as thermal conductivity measuring sensors. Platinum resistance thermometers to be used in the temperature measurement were also developed using the same technology, and were constructed and tested. A new data acquisition system for the thermal conductivity sensors, based on a linearization of the transient hot-strip model and including a portable electronic bridge for the measurement of thermal conductivity in gas process conditions, was also developed. The equipment is capable of measuring the thermal conductivity of gaseous phases with an accuracy of 2%-5% up to 840 °C (95% confidence level). The development of sensors up to 1200 °C, present at the core of combustion chambers, will be done in the near future.
Measuring disaster-resilient communities: a case study of coastal communities in Indonesia.
Kafle, Shesh Kanta
2012-01-01
Vulnerability reduction and resilience building of communities are central concepts in recent policy debates. Although there are fundamental linkages, and complementarities exist between the two concepts, recent policy and programming has focused more on the latter. It is assumed here that reducing underlying causes of vulnerabilities and their interactions with resilience elements is a prerequisite for obtaining resilience capabilities. An integrated approach, incorporating both vulnerability and resilience considerations, was taken while developing an index for measuring disaster-resilient communities. This study outlines a method for measuring community resilience capabilities using process and outcome indicators in 43 coastal communities in Indonesia. An index was developed using ten process and 25 outcome indicators, selected on the basis of the ten steps of the Integrated Community Based Risk Reduction (ICBRR) process and key characteristics of disaster-resilient communities drawn from the literature. The overall index value of all 43 communities was 63, whereas the process and outcome indicator values were measured as 63 and 61.5 respectively. The core components of this index are process and outcome indicators. The tool was developed on the assumption that both process and outcome indicators are equally important in building disaster-resilient communities. The combination of both indicators is an impetus to quality change in the community. Process indicators are important for community understanding, ownership, and the sustainability of the programme, whereas outcome indicators are important for the real achievements in terms of community empowerment and capacity development. The ICBRR process varies by country and location according to the level of community awareness and organisational strategy. However, core elements, such as the formation of community groups and the mobilisation of those groups in risk assessment and planning, should be present in all countries and locations. As this study shows, community resiliency can be measured, but any such measurement must be both location- and hazard-specific.
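The abstract does not specify how the indicator scores are aggregated into the index; a minimal sketch, assuming indicator scores on a 0-100 scale and a simple equally weighted average of the process and outcome sub-indices (the scores below are hypothetical, not the study's data), might look like:

```python
def sub_index(scores):
    """Average of indicator scores, each assumed on a 0-100 scale."""
    return sum(scores) / len(scores)

def resilience_index(process_scores, outcome_scores):
    """Equally weighted combination of process and outcome sub-indices,
    reflecting the assumption that both indicator sets matter equally."""
    return 0.5 * sub_index(process_scores) + 0.5 * sub_index(outcome_scores)

# Hypothetical scores: 10 process indicators and 25 outcome indicators
process = [70, 65, 60, 55, 75, 60, 65, 58, 62, 60]
outcome = [60] * 20 + [65] * 5

print(round(resilience_index(process, outcome), 1))  # → 62.0
```

Equal weighting is only one possible design choice; a scheme weighted toward outcome indicators would instead emphasize demonstrated community capacity over programme process.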
Registration of surface structures using airborne focused ultrasound.
Sundström, N; Börjesson, P O; Holmer, N G; Olsson, L; Persson, H W
1991-01-01
A low-cost measuring system, based on a personal computer combined with standard equipment for complex measurements and signal processing, has been assembled. Such a system increases the possibilities for small hospitals and clinics to finance advanced measuring equipment. A description of equipment developed for airborne ultrasound together with a personal computer-based system for fast data acquisition and processing is given. Two air-adapted ultrasound transducers with high lateral resolution have been developed. Furthermore, a few results for fast and accurate estimation of signal arrival time are presented. The theoretical estimation models developed are applied to skin surface profile registrations.
Interactive information processing for NASA's mesoscale analysis and space sensor program
NASA Technical Reports Server (NTRS)
Parker, K. G.; Maclean, L.; Reavis, N.; Wilson, G.; Hickey, J. S.; Dickerson, M.; Karitani, S.; Keller, D.
1985-01-01
The Atmospheric Sciences Division (ASD) of the Systems Dynamics Laboratory at NASA's Marshall Space Flight Center (MSFC) is currently involved in interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program. Specifically, the ASD is engaged in the development and implementation of new space-borne remote sensing technology to observe and measure mesoscale atmospheric processes. These space measurements and conventional observational data are being processed together to gain an improved understanding of the mesoscale structure and the dynamical evolution of the atmosphere relative to cloud development and precipitation processes. To satisfy its vast data processing requirements, the ASD has developed a Researcher Computer System consisting of three primary computer systems, which provides over 20 scientists with a wide range of capabilities for processing and displaying large volumes of remote sensing data. Each of the computers performs a specific function according to its unique capabilities.
2013-01-01
Background Although research interest in hospital process orientation (HPO) is growing, the development of a measurement tool to assess process orientation (PO) has not been very successful yet. To view a hospital as a series of processes organized around patients with a similar demand seems to be an attractive proposition, but it is hard to operationalize this idea in a measurement tool that can actually measure the level of PO. This research contributes to HPO from an operations management (OM) perspective by addressing the alignment, integration and coordination of activities within patient care processes. The objective of this study was to develop and practically test a new measurement tool for assessing the degree of PO within hospitals using existing tools. Methods Through a literature search we identified a number of constructs to measure PO in hospital settings. These constructs were further operationalized, using an OM perspective. Based on five dimensions of an existing questionnaire a new HPO-measurement tool was developed to measure the degree of PO within hospitals on the basis of respondents’ perception. The HPO-measurement tool was pre-tested in a non-participating hospital and discussed with experts in a focus group. The multicentre exploratory case study was conducted in the ophthalmic practices of three different types of Dutch hospitals. In total 26 employees from three disciplines participated. After filling in the questionnaire an interview was held with each participant to check the validity and the reliability of the measurement tool. Results The application of the HPO-measurement tool, analysis of the scores and interviews with the participants resulted in the possibility to identify differences of PO performance and the areas of improvement – from a PO point of view – within each hospital. 
The result of refining the items of the measurement tool after practical testing is a set of 41 items to assess the degree of PO from an OM perspective within hospitals. Conclusions The development and practical testing of a new HPO-measurement tool improve the understanding and application of PO in hospitals and the reliability of the measurement tool. The study shows that PO is a complex concept that remains hard to objectify. PMID:24219362
Gonçalves, Pedro D; Hagenbeek, Marie Louise; Vissers, Jan M H
2013-11-13
Although research interest in hospital process orientation (HPO) is growing, the development of a measurement tool to assess process orientation (PO) has not been very successful yet. To view a hospital as a series of processes organized around patients with a similar demand seems to be an attractive proposition, but it is hard to operationalize this idea in a measurement tool that can actually measure the level of PO. This research contributes to HPO from an operations management (OM) perspective by addressing the alignment, integration and coordination of activities within patient care processes. The objective of this study was to develop and practically test a new measurement tool for assessing the degree of PO within hospitals using existing tools. Through a literature search we identified a number of constructs to measure PO in hospital settings. These constructs were further operationalized, using an OM perspective. Based on five dimensions of an existing questionnaire a new HPO-measurement tool was developed to measure the degree of PO within hospitals on the basis of respondents' perception. The HPO-measurement tool was pre-tested in a non-participating hospital and discussed with experts in a focus group. The multicentre exploratory case study was conducted in the ophthalmic practices of three different types of Dutch hospitals. In total 26 employees from three disciplines participated. After filling in the questionnaire an interview was held with each participant to check the validity and the reliability of the measurement tool. The application of the HPO-measurement tool, analysis of the scores and interviews with the participants resulted in the possibility to identify differences of PO performance and the areas of improvement--from a PO point of view--within each hospital. The result of refinement of the items of the measurement tool after practical testing is a set of 41 items to assess the degree of PO from an OM perspective within hospitals. 
The development and practical testing of a new HPO-measurement tool improve the understanding and application of PO in hospitals and the reliability of the measurement tool. The study shows that PO is a complex concept that remains hard to objectify.
The development of Music in Dementia Assessment Scales (MiDAS)
McDermott, Orii; Orrell, Martin; Ridder, Hanne Mette
2015-01-01
There is a need to develop an outcome measure specific to music therapy in dementia that reflects a holistic picture of the therapy process and outcome. This study aimed to develop a clinically relevant and scientifically robust music therapy outcome measure incorporating the values and views of people with dementia. Focus groups and interviews were conducted to obtain qualitative data on what music meant to people with dementia and the observed effects of music. Expert and peer consultations were conducted at each stage of the measure development to maximise its content validity. The new measure was field-tested by clinicians in a care home. Feedback from the clinicians and music therapy experts were incorporated during the review and refinement process of the measure. A review of the existing literature, the experiential results and the consensus process enabled the development of the new outcome measure “Music in Dementia Assessment Scales (MiDAS)”. Analysis of the qualitative data identified five key areas of the impact of music on people with dementia and they were transformed as the five Visual Analogue Scale (VAS) items: levels of Interest, Response, Initiation, Involvement and Enjoyment. MiDAS comprises the five VAS items and a supplementary checklist of notable positive and negative reactions from the individual. This study demonstrates that it is possible to design and develop an easy to apply and rigorous quantitative outcome measure which has a high level of clinical relevance for people with dementia, care home staff and music therapists. PMID:26246670
Assessing the Process of Retirement: a Cross-Cultural Review of Available Measures.
Rafalski, Julia C; Noone, Jack H; O'Loughlin, Kate; de Andrade, Alexsandro L
2017-06-01
Retirement research is now expanding beyond the post-World War II baby boomers' retirement attitudes and plans to include the nature of their workforce exit and how successfully they adjust to their new life. These elements are collectively known as the process of retirement. However, there is insufficient research in developing countries to inform the management of their ageing populations regarding this process. This review aims to facilitate national and cross-cultural research in developing and non-English-speaking countries by reviewing the existing measures of the retirement process published in English and Portuguese. The review identified 28 existing measures assessing retirement attitudes, planning, decision making, adjustment, and satisfaction with retirement. Information on each scale's item structure, internal reliability, grammatical structure, and evidence of translations to other languages is presented. Of the 28 measures, 20 assessed retirement attitudes, plans, and decision-making; 5 assessed adjustment to retirement; and only 2 assessed retirement satisfaction. Only 8 of the 28 scales had been translated into languages other than English. There is scope to translate measures of retirement attitudes and planning into other languages. However, there is a paucity of translated measures of retirement decision-making and adjustment, and of measures of retirement satisfaction in general. Within the limitations of this review, researchers are provided with the background to decide between translating existing measures and developing more culturally appropriate assessment tools for addressing their research questions.
NASA Astrophysics Data System (ADS)
Bechtler, Laurie; Velidandla, Vamsi
2003-04-01
In response to demand for higher volumes and greater product capability, integrated optoelectronic device processing is rapidly increasing in complexity, benefiting from techniques developed for conventional silicon integrated circuit processing. The needs for high product yield and low manufacturing cost are also similar to the silicon wafer processing industry. This paper discusses the design and use of an automated inspection instrument called the Optical Surface Analyzer (OSA) to evaluate two critical production issues in optoelectronic device manufacturing: (1) film thickness uniformity, and (2) defectivity at various process steps. The OSA measurement instrument is better suited to photonics process development than most equipment developed for conventional silicon wafer processing in two important ways: it can handle both transparent and opaque substrates (unlike most inspection and metrology tools), and it is a full-wafer inspection method that captures defects and film variations over the entire substrate surface (unlike most film thickness measurement tools). Measurement examples will be provided in the paper for a variety of films and substrates used for optoelectronics manufacturing.
Rodriguez, Hayley; Kissell, Kellie; Lucas, Lloyd; Fisak, Brian
2017-11-01
Although negative beliefs have been found to be associated with worry symptoms and depressive rumination, negative beliefs have yet to be examined in relation to post-event processing and social anxiety symptoms. The purpose of the current study was to examine the psychometric properties of the Negative Beliefs about Post-Event Processing Questionnaire (NB-PEPQ). A large, non-referred undergraduate sample completed the NB-PEPQ along with validation measures, including a measure of post-event processing and social anxiety symptoms. Based on factor analysis, a single-factor model was obtained, and the NB-PEPQ was found to exhibit good validity, including positive associations with measures of post-event processing and social anxiety symptoms. These findings add to the literature on the metacognitive variables that may lead to the development and maintenance of post-event processing and social anxiety symptoms, and have relevant clinical applications.
NASA Technical Reports Server (NTRS)
Butler, C. M.; Hogge, J. E.
1978-01-01
Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards, or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of the data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was also developed for processing air quality data to include time series analysis and goodness-of-fit tests, and to (1) calculate a larger number of daily statistical measures of location, as well as daily, monthly, and yearly measures of location, dispersion, skewness, and kurtosis, (2) decompose the extended time series model, and (3) perform goodness-of-fit tests. The computer program is described, documented, and illustrated by examples. Recommendations are made for continued development of research on processing air quality data.
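The daily statistical measures and simple correlation coefficients described above can be sketched in modern Python; the readings and parameter names below are hypothetical, not from the original data set:

```python
import statistics

# Hypothetical hourly readings for two air quality parameters on one day
ozone = [0.031, 0.042, 0.055, 0.048, 0.037, 0.029]
no2 = [0.018, 0.024, 0.031, 0.028, 0.021, 0.017]

def daily_measures(values):
    """Daily measures of location, dispersion, and skewness."""
    n = len(values)
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)
    # Adjusted sample skewness (third standardized moment)
    skew = sum(((v - mean) / sd) ** 3 for v in values) * n / ((n - 1) * (n - 2))
    return {"mean": mean, "max": max(values), "sd": sd, "skewness": skew}

def pearson_r(x, y):
    """Simple (Pearson) correlation coefficient between two series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

print(daily_measures(ozone))
print(round(pearson_r(ozone, no2), 3))
```

With daily summaries like these in hand, the scatter diagrams and time-history plots mentioned in the abstract reduce to plotting the raw series against the per-day statistics.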
ERIC Educational Resources Information Center
Kuppen, Sarah; Huss, Martina; Fosker, Tim; Fegan, Natasha; Goswami, Usha
2011-01-01
We explore the relationships between basic auditory processing, phonological awareness, vocabulary, and word reading in a sample of 95 children, 55 typically developing children, and 40 children with low IQ. All children received nonspeech auditory processing tasks, phonological processing and literacy measures, and a receptive vocabulary task.…
Climate Action Planning Process | Climate Neutral Research Campuses | NREL
For research campuses, NREL has developed a five-step process to develop and implement climate action plans: (1) determine baseline energy consumption; (2) analyze technology options; (3) prepare a plan and set priorities; (4) implement the climate action plan; (5) measure and
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holcomb, David Eugene
2015-01-01
Fluoride salt-cooled High temperature Reactors (FHRs) are entering into early phase engineering development. Initial candidate technologies have been identified to measure all of the required process variables. The purpose of this paper is to describe the proposed measurement techniques in sufficient detail to enable assessment of the proposed instrumentation suite and to support development of the component technologies. This paper builds upon the instrumentation chapter of the recently published FHR technology development roadmap. Locating instruments outside of the intense core radiation and high-temperature fluoride salt environment significantly decreases their environmental tolerance requirements. Under operating conditions, FHR primary coolant salt is a transparent, low-vapor-pressure liquid. Consequently, FHRs can employ standoff optical measurements from above the salt pool to assess in-vessel conditions. For example, the core outlet temperature can be measured by observing the fuel's blackbody emission. Similarly, the intensity of the core's Cerenkov glow indicates the fission power level. Short-lived activation of the primary coolant provides another means for standoff measurements of process variables. The primary coolant flow and neutron flux can be measured using gamma spectroscopy along the primary coolant piping. FHR operation entails a number of process measurements. Reactor thermal power and core reactivity are the most significant variables for process control. Thermal power can be determined by measuring the primary coolant mass flow rate and temperature rise across the core. The leading candidate technologies for primary coolant temperature measurement are Au-Pt thermocouples and Johnson noise thermometry. Clamp-on ultrasonic flow measurement, that includes high-temperature tolerant standoffs, is a potential coolant flow measurement technique. Also, the salt redox condition will be monitored as an indicator of its corrosiveness. Both electrochemical techniques and optical spectroscopy are candidate fluoride salt redox measurement methods. Coolant level measurement can be performed using radar-level gauges located in standpipes above the reactor vessel. While substantial technical development remains for most of the instruments, industrially compatible instruments based upon proven technology can be reasonably extrapolated from the current state of the art.
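The thermal power determination described above reduces to the standard heat balance Q = m_dot * cp * dT. A minimal sketch follows; the flow rate, temperature rise, and the FLiBe heat capacity value (~2416 J/kg-K) are illustrative assumptions, not figures from the paper:

```python
def thermal_power_mw(mass_flow_kg_s, cp_j_kg_k, delta_t_k):
    """Core thermal power from the primary coolant heat balance,
    Q = m_dot * cp * dT, returned in megawatts."""
    return mass_flow_kg_s * cp_j_kg_k * delta_t_k / 1e6

# Hypothetical FHR conditions: FLiBe coolant (cp ~ 2416 J/kg-K),
# 1000 kg/s primary flow, 100 K temperature rise across the core
print(thermal_power_mw(1000.0, 2416.0, 100.0))  # → 241.6
```

The heat balance is why the flow and temperature instruments (clamp-on ultrasonic flow meters, Au-Pt thermocouples, Johnson noise thermometry) are singled out as the leading candidates: their combined uncertainty bounds the thermal power measurement.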
Mander, Johannes
2015-01-01
There is a dearth of measures specifically designed to assess empirically validated mechanisms of therapeutic change. To fill in this research gap, the aim of the current study was to develop a measure that covers a large variety of empirically validated mechanisms of change with corresponding versions for the patient and therapist. To develop an instrument that is based on several important change process frameworks, we combined two established change mechanisms instruments: the Scale for the Multiperspective Assessment of General Change Mechanisms in Psychotherapy (SACiP) and the Scale of the Therapeutic Alliance-Revised (STA-R). In our study, 457 psychosomatic inpatients completed the SACiP and the STA-R and diverse outcome measures in early, middle and late stages of psychotherapy. Data analyses were conducted using factor analyses and multilevel modelling. The psychometric properties of the resulting Individual Therapy Process Questionnaire were generally good to excellent, as demonstrated by (a) exploratory factor analyses on both patient and therapist ratings, (b) CFA on later measuring times, (c) high internal consistencies and (d) significant outcome predictive effects. The parallel forms of the ITPQ deliver opportunities to compare the patient and therapist perspectives for a broader range of facets of change mechanisms than was hitherto possible. Consequently, the measure can be applied in future research to more specifically analyse different change mechanism profiles in session-to-session development and outcome prediction. Key Practitioner Message This article describes the development of an instrument that measures general mechanisms of change in psychotherapy from both the patient and therapist perspectives. Post-session item ratings from both the patient and therapist can be used as feedback to optimize therapeutic processes. We provide a detailed discussion of measures developed to evaluate therapeutic change mechanisms. 
Copyright © 2014 John Wiley & Sons, Ltd.
The Development of Model for Measuring Railway Wheels Manufacturing Readiness Level
NASA Astrophysics Data System (ADS)
Inrawan Wiratmadja, Iwan; Mufid, Anas
2016-02-01
In an effort to grow the railway wheel industry in Indonesia and reduce dependence on imports, the Metal Industries Development Center (MIDC) is implementing railway wheel manufacturing technology in Indonesia. MIDC is a research and development institution tasked with producing a railway wheel prototype and acting as a supervisor to Indonesian industry in implementing railway wheel manufacturing technology. Because implementing a manufacturing technology requires substantial resources, it is necessary to measure manufacturing readiness. In this study, railway wheel manufacturing readiness was measured using the manufacturing readiness level (MRL) model from the United States Department of Defense. The MRL model comprises 10 manufacturing readiness levels described by 90 criteria and 184 sub-criteria. To obtain a sound and accurate measurement instrument, the development process involved experts through an expert-judgment method and was validated with the content validity ratio (CVR). The measurement instrument developed in this study consists of 448 indicators. The measurement results show that MIDC's railway wheel manufacturing readiness is at level 4, revealing a gap between MIDC's current readiness and the level required to achieve the program objectives, which is level 5. To reach level 5, MIDC must improve the indicators related to the cost and financing, process capability and control, quality management, workers, and manufacturing management criteria.
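The content validity ratio used to validate the instrument has a standard closed form (Lawshe's CVR); a minimal sketch, independent of the study's actual panel data:

```python
def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's content validity ratio: CVR = (n_e - N/2) / (N/2),
    ranging from -1 (no panelist rates the item essential) to +1
    (every panelist does)."""
    half = n_panelists / 2
    return (n_essential - half) / half
```

An item rated essential by 9 of 10 experts scores CVR = 0.8; items below a tabulated critical value for the panel size are typically dropped.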
Developing Resident-Sensitive Quality Measures: A Model From Pediatric Emergency Medicine.
Schumacher, Daniel J; Holmboe, Eric S; van der Vleuten, Cees; Busari, Jamiu O; Carraccio, Carol
2017-12-05
To begin closing the gap with respect to quality measures available for use among residents, the authors sought to identify and develop resident-sensitive quality measures (RSQMs) for use in the pediatric emergency department (PED) setting. In May 2016, the authors reviewed National Quality Measures Clearinghouse (NQMC) measures to identify resident-sensitive measures. To create additional measures focused on common, acute illnesses (acute asthma exacerbation, bronchiolitis, closed head injury [CHI]) in the PED, the authors used a nominal group technique (NGT) and Delphi process from September to December 2016. To achieve a local focus for developing these measures, all NGT and Delphi participants were from Cincinnati Children's Hospital Medical Center. Delphi participants rated measures developed through the NGT in two areas: the importance of the measure to quality care and the likelihood that the measure represents the work of a resident. The review of NQMC measures identified 28 of 183 as potentially resident-sensitive. The NGT produced 67 measures for asthma, 46 for bronchiolitis, and 48 for CHI. These were used in the first round of the Delphi process. After two rounds, 18 measures for asthma, 21 for bronchiolitis, and 22 for CHI met automatic inclusion criteria. In round three, participants ranked the remaining candidate measures, identifying their top 10 and next 5. This study describes a template for identifying and developing RSQMs that may promote high-quality care delivery during and following training. Next steps should include implementing and seeking validity evidence for the locally developed measures.
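A two-dimension Delphi inclusion rule of the kind described can be sketched as follows; the 1-9 scale, cutoff, and agreement threshold below are illustrative assumptions, not the study's actual criteria:

```python
def delphi_auto_include(ratings_importance, ratings_resident,
                        cutoff=7, agreement=0.8):
    """Hypothetical automatic-inclusion rule: retain a measure when at
    least `agreement` of panelists rate both its importance to quality
    care and its resident-attributability at or above `cutoff` on an
    assumed 1-9 scale."""
    def frac_at_or_above(ratings):
        return sum(r >= cutoff for r in ratings) / len(ratings)
    return (frac_at_or_above(ratings_importance) >= agreement
            and frac_at_or_above(ratings_resident) >= agreement)
```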
Submillimeter Spectroscopic Study of Semiconductor Processing Plasmas
NASA Astrophysics Data System (ADS)
Helal, Yaser H.
Plasmas used in the manufacturing processes of semiconductor devices are complex and challenging to characterize. The development and improvement of plasma processes and models rely on feedback from experimental measurements. Current diagnostic methods are not capable of measuring absolute densities of plasma species with high resolution without altering the plasma, or without input from other measurements. At pressures below 100 mTorr, spectroscopic measurements of rotational transitions in the submillimeter/terahertz (SMM) spectral region are narrow enough in relation to the sparsity of spectral lines that absolute specificity of measurement is possible. The frequency resolution of SMM sources is such that spectral absorption features can be fully resolved. Processing plasmas are at pressures and temperatures similar to the environments used to study astrophysical species in the SMM spectral region. Many of the molecular neutrals, radicals, and ions present in processing plasmas have been studied in the laboratory, and their absorption spectra have been cataloged or are in the literature for the purpose of astrophysical study. Recent developments in SMM devices have made the technology commercially available for applications outside specialized laboratories. The methods developed over several decades in the SMM spectral region for these laboratory studies are directly applicable to diagnostic measurements in the semiconductor manufacturing industry. In this work, a continuous-wave, intensity-calibrated SMM absorption spectrometer was developed as a remote sensor of gas and plasma species. A major advantage of intensity-calibrated rotational absorption spectroscopy is its ability to determine absolute concentrations and temperatures of plasma species from first principles without altering the plasma environment.
An important part of this work was the design of the optical components which couple 500 - 750 GHz radiation through a commercial inductively coupled plasma chamber. The measurement of transmission spectra was simultaneously fit for background and absorption signal. The measured absorption signal was used to calculate absolute densities and temperatures of polar species. Measurements of molecular species were demonstrated for inductively coupled plasmas.
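In its simplest form, the first-principles density determination behind intensity-calibrated absorption spectroscopy reduces to the Beer-Lambert law; a minimal sketch that omits the line-shape integration and rotational partition-function corrections used in practice:

```python
import math

def absolute_density(i_transmitted, i_incident,
                     cross_section_cm2, path_length_cm):
    """Column-averaged absolute number density (cm^-3) from a single
    Beer-Lambert absorbance: n = -ln(I/I0) / (sigma * L)."""
    absorbance = -math.log(i_transmitted / i_incident)
    return absorbance / (cross_section_cm2 * path_length_cm)
```

With an absorption cross-section of 1e-16 cm^2 and a 10 cm path, one optical depth corresponds to a density of 1e15 cm^-3.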
Advanced in-line metrology strategy for self-aligned quadruple patterning
NASA Astrophysics Data System (ADS)
Chao, Robin; Breton, Mary; L'herron, Benoit; Mendoza, Brock; Muthinti, Raja; Nelson, Florence; De La Pena, Abraham; Le, Fee li; Miller, Eric; Sieg, Stuart; Demarest, James; Gin, Peter; Wormington, Matthew; Cepler, Aron; Bozdog, Cornel; Sendelbach, Matthew; Wolfling, Shay; Cardinal, Tom; Kanakasabapathy, Sivananda; Gaudiello, John; Felix, Nelson
2016-03-01
Self-Aligned Quadruple Patterning (SAQP) is a promising technique for extending 193-nm lithography to manufacture structures at 20 nm half-pitch or smaller. The process uses multiple sidewall spacer image transfers to reduce a relatively relaxed design to a quarter of its original pitch. Because of the many process steps required for the pitch splitting in SAQP, process error propagates through each deposition and etch and accumulates at the final step into structure variations, such as pitch walk and poor critical dimension uniformity (CDU). These variations can further affect the downstream processes and lower the yield. The impact of this error propagation becomes significant for advanced technology nodes, where device-design CD specifications are at the nanometer scale. Therefore, semiconductor manufacturing demands strict in-line process control to ensure a high process yield and improved performance, which must rely on precise measurements to enable corrective actions and quick decision making for process development. This work aims to provide a comprehensive metrology solution for SAQP. During SAQP process development, the challenges in conventional in-line metrology techniques start to surface. For instance, critical-dimension scanning electron microscopy (CDSEM) is commonly the first choice for CD and pitch variation control. However, the high aspect ratio at mandrel-level processes and the trench variations after etch prevent the tool from extracting the structure's true bottom edges in order to report the position shift. On the other hand, while the complex shape and variations can be captured with scatterometry, or optical CD (OCD), asymmetric features such as pitch walk show low sensitivity with strong correlations in scatterometry.
X-ray diffraction (XRD) is known to provide useful direct measurements of pitch walk in crystalline arrays, yet the data analysis is influenced by the incoming geometry and must be applied carefully. A successful implementation of SAQP process control for yield improvement requires these metrology issues to be addressed. By optimizing the measurement parameters and beam configurations, CDSEM measurements distinguish each of the spaces corresponding to the upstream mandrel processes and report their CDs separately, feeding back to the process team for the next development cycle. We also utilize the unique capability of scatterometry to measure structure details in-line and implement a "predictive" process control, which shows a good correlation between the "predictive" measurement and the cross-sections from our design of experiments (DOE). The ability to measure pitch walk with scatterometry was also demonstrated. This work also explored the frontier of in-line XRD capability by enabling automatic reciprocal space map (RSM) fitting on the tool to output pitch walk values. With these advances in metrology development, we demonstrate the impact of in-line monitoring on the SAQP process and shorten the patterning development learning cycle to improve yield.
Development of Data Processing and Analysis Tools for Atmospheric Radiation Measurements
NASA Technical Reports Server (NTRS)
Guillet, N.; Stassinopoulos, E. G.; Stauffer, C. A.; Dumas, M.; Palau, J.-M.; Calvet, M.-C.
2001-01-01
This paper reports on the data processing methods and techniques of measurements made by several miniature radiation spectrometers flying on different types of carriers within the Earth's atmosphere at aviation and balloon altitudes.
An Adaptive Kalman Filter using a Simple Residual Tuning Method
NASA Technical Reports Server (NTRS)
Harman, Richard R.
1999-01-01
One difficulty in using Kalman filters in real world situations is the selection of the correct process noise, measurement noise, and initial state estimate and covariance. These parameters are commonly referred to as tuning parameters. Multiple methods have been developed to estimate these parameters. Most of those methods such as maximum likelihood, subspace, and observer Kalman Identification require extensive offline processing and are not suitable for real time processing. One technique, which is suitable for real time processing, is the residual tuning method. Any mismodeling of the filter tuning parameters will result in a non-white sequence for the filter measurement residuals. The residual tuning technique uses this information to estimate corrections to those tuning parameters. The actual implementation results in a set of sequential equations that run in parallel with the Kalman filter. Equations for the estimation of the measurement noise have also been developed. These algorithms are used to estimate the process noise and measurement noise for the Wide Field Infrared Explorer star tracker and gyro.
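A residual-tuning scheme of the kind described can be sketched for a scalar random-walk model, where the measurement noise R is re-estimated from the sample covariance of recent innovations; this is a simplified illustration, not the algorithm used for the Wide Field Infrared Explorer:

```python
def adaptive_kalman_1d(measurements, q=1e-4, r_init=1.0, window=20):
    """Scalar random-walk Kalman filter that re-estimates the measurement
    noise R from a sliding window of innovations (residuals): R is
    approximated by the innovation sample covariance minus the predicted
    state variance. Simplified sketch of residual-based tuning."""
    x, p, r = measurements[0], 1.0, r_init
    innovations, estimates = [], []
    for z in measurements[1:]:
        p = p + q                       # time update (random-walk state)
        nu = z - x                      # innovation (measurement residual)
        innovations.append(nu)
        if len(innovations) >= window:  # residual-based re-tuning of R
            recent = innovations[-window:]
            c = sum(v * v for v in recent) / window
            r = max(c - p, 1e-12)       # subtract predicted variance; keep positive
        k = p / (p + r)                 # Kalman gain
        x = x + k * nu                  # measurement update
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates, r
```

Because the re-tuning equations are sequential and run in parallel with the filter itself, the approach remains suitable for real-time processing.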
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takahashi, Shunji; Takashima, Seigo
2009-09-01
Atomic radicals such as hydrogen (H) and oxygen (O) play important roles in process plasmas. In a previous study, we developed a system for measuring the absolute density of H, O, nitrogen, and carbon atoms in plasmas using vacuum ultraviolet absorption spectroscopy (VUVAS) with a compact light source using an atmospheric pressure microplasma [microdischarge hollow cathode lamp (MHCL)]. In this study, we developed a monitoring probe for atomic radicals employing the VUVAS with the MHCL. The probe size was 2.7 mm in diameter. Using this probe, only a single port needs to be accessed for radical density measurements. We successfully measured the spatial distribution of the absolute densities of H and O atomic radicals in a radical-based plasma processing system by moving the probe along the radial direction of the chamber. This probe allows convenient analysis of atomic radical densities to be carried out for any type of process plasma at any time. We refer to this probe as a ubiquitous monitoring probe for atomic radicals.
Nickel-Phosphorous Development for Total Solar Irradiance Measurement
NASA Astrophysics Data System (ADS)
Carlesso, F.; Berni, L. A.; Vieira, L. E. A.; Savonov, G. S.; Nishimori, M.; Dal Lago, A.; Miranda, E.
2017-10-01
The development of an absolute radiometer instrument for TSI measurements is an ongoing effort at INPE. In this work, we describe the development of black Ni-P coatings for the absorptive cavities of TSI radiometers. We present a study of the surface-blackening process and the relationships among morphological structure, chemical composition, and coating absorption. Ni-P deposits with different phosphorous contents were obtained by electroless techniques on aluminum substrates with a thin zincate layer. Appropriate phosphorus composition and etching process parameters produce low-reflectance black coatings.
The Development of a Practical and Reliable Assessment Measure for Atopic Dermatitis (ADAM).
ERIC Educational Resources Information Center
Charman, Denise; Varigos, George; Horne, David J. de L.; Oberklaid, Frank
1999-01-01
A study was conducted in Australia to develop a reliable, valid, and practical measure of atopic dermatitis. The test development process and validity evaluation with two doctors and 51 patients are discussed. Results suggest that operational definitions of the scales need to be defined more clearly. The measure satisfies assumptions for a partial…
Metrology: Calibration and measurement processes guidelines
NASA Technical Reports Server (NTRS)
Castrup, Howard T.; Eicke, Woodward G.; Hayes, Jerry L.; Mark, Alexander; Martin, Robert E.; Taylor, James L.
1994-01-01
The guide is intended as a resource to aid engineers and systems contractors in the design, implementation, and operation of metrology, calibration, and measurement systems, and to assist NASA personnel in the uniform evaluation of such systems supplied or operated by contractors. Methodologies and techniques acceptable in fulfilling metrology quality requirements for NASA programs are outlined. The measurement process is covered from a high level through more detailed discussions of key elements within the process. Emphasis is given to the flowdown of project requirements to measurement system requirements, then through the activities that will provide measurements with defined quality. In addition, innovations and techniques for error analysis, development of statistical measurement process control, optimization of calibration recall systems, and evaluation of measurement uncertainty are presented.
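Evaluation of measurement uncertainty, one of the techniques the guide covers, commonly combines independent standard uncertainty components in root-sum-square fashion; a minimal sketch:

```python
import math

def combined_standard_uncertainty(components, k=2.0):
    """Root-sum-square combination of independent standard uncertainty
    components, u_c = sqrt(sum u_i^2). Returns (u_c, expanded
    uncertainty U = k * u_c); k ~ 2 gives roughly 95% coverage for a
    normal distribution."""
    u_c = math.sqrt(sum(u * u for u in components))
    return u_c, k * u_c
```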
Method for 3D noncontact measurements of cut trees package area
NASA Astrophysics Data System (ADS)
Knyaz, Vladimir A.; Vizilter, Yuri V.
2001-02-01
Progress in imaging sensors and computers creates the background for numerous 3D imaging applications across a wide variety of manufacturing activities. The timber industry has many demands for automated, precise measurement. One of them is accurate volume determination for cut trees carried on a truck. The key point for volume estimation is determining the front area of the cut-tree package. To eliminate the slow and inaccurate manual measurements now in practice, an experimental system for automated non-contact wood measurement was developed. The system includes two non-metric CCD video cameras, a PC as the central processing unit, frame grabbers, and original software for image processing and 3D measurement. The proposed measurement method is based on capturing a stereo pair of the front of the tree package and performing an image orthotransformation into the front plane. This technique allows the transformed image to be processed for circle-shape recognition and calculation of their areas. The metric characteristics of the system are provided by a special camera calibration procedure. The paper presents the developed method of 3D measurement, describes the hardware used for image acquisition and the software implementing the developed algorithms, and gives the productivity and precision characteristics of the system.
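The orthotransformation step can be sketched as applying a plane-to-plane homography to image points, followed by summing the recognized circle areas; the homography matrix itself is assumed to come from the camera calibration procedure, which this sketch omits:

```python
import math

def apply_homography(h, x, y):
    """Map an image point (x, y) into the front (ortho) plane using a
    3x3 homography h given as nested lists; h would be derived from
    the calibration procedure."""
    xp = h[0][0] * x + h[0][1] * y + h[0][2]
    yp = h[1][0] * x + h[1][1] * y + h[1][2]
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return xp / w, yp / w

def package_front_area(radii_m):
    """Front area (m^2) of the package as the sum of the recognized
    log-end circle areas."""
    return sum(math.pi * r * r for r in radii_m)
```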
NASA Astrophysics Data System (ADS)
do Lago, Naydson Emmerson S. P.; Kardec Barros, Allan; Sousa, Nilviane Pires S.; Junior, Carlos Magno S.; Oliveira, Guilherme; Guimares Polisel, Camila; Eder Carvalho Santana, Ewaldo
2018-01-01
This study aims to develop an adaptive filter algorithm to determine the percentage of body fat from anthropometric indicators in adolescents. Measurements such as body mass, height, and waist circumference were collected for analysis. The filter design was based on the Wiener filter, used to produce an estimate of a random process; the Wiener filter minimizes the mean square error between the estimated random process and the desired process. The LMS algorithm was also studied for the development of the filter because of its simplicity and ease of computation. Excellent results were obtained with the developed filter; these results were analyzed and compared with the collected data.
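The LMS adaptation the authors mention can be sketched generically; the single predictor, learning rate, and epoch count below are illustrative assumptions, not the study's:

```python
def lms_fit(inputs, targets, mu=0.05, n_epochs=2000):
    """Least-mean-squares adaptation of a linear predictor:
    w <- w + mu * error * x, a stochastic approximation that drives the
    mean squared error toward the Wiener (minimum-MSE) solution."""
    w = [0.0] * len(inputs[0])
    b = 0.0
    for _ in range(n_epochs):
        for x, d in zip(inputs, targets):
            y = b + sum(wi * xi for wi, xi in zip(w, x))  # prediction
            e = d - y                                     # estimation error
            b += mu * e
            w = [wi + mu * e * xi for wi, xi in zip(w, x)]
    return w, b
```

On noiseless data generated by d = 2x + 1, the weights converge to the exact solution, illustrating the minimum-MSE fixed point.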
Welding process modelling and control
NASA Technical Reports Server (NTRS)
Romine, Peter L.; Adenwala, Jinen A.
1993-01-01
The research and analysis performed, software developed, and hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. The Metals Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, this system is not intended to be used for welding process control.
An intraorganizational model for developing and spreading quality improvement innovations.
Kellogg, Katherine C; Gainer, Lindsay A; Allen, Adrienne S; O'Sullivan, Tatum; Singer, Sara J
Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group's traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Successful intraorganizational spread is possible and sustainable. 
Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread.
An intraorganizational model for developing and spreading quality improvement innovations
Kellogg, Katherine C.; Gainer, Lindsay A.; Allen, Adrienne S.; O'Sullivan, Tatum; Singer, Sara J.
2017-01-01
Background: Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. Purpose: We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. Methodology/Approach: We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group’s traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. Findings: The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Practice Implications: Successful intraorganizational spread is possible and sustainable. 
Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread. PMID:27428788
NCTM of liquids at high temperatures using polarization techniques
NASA Technical Reports Server (NTRS)
Krishnan, Shankar; Weber, J. K. Richard; Nordine, Paul C.; Schiffman, Robert A.
1990-01-01
Temperature measurement and control are extremely important in any materials processing application. However, conventional techniques for non-contact temperature measurement (mainly optical pyrometry) are very uncertain because of unknown or varying surface emittance. Optical properties, like other properties, change during processing, so a dynamic, in-situ measurement of optical properties, including the emittance, is required. Intersonics is developing new technologies that use polarized laser light scattering to determine the surface emittance of freely radiating bodies concurrently with conventional optical pyrometry; together these are sufficient to determine the true surface temperature of the target. Intersonics is currently developing a system called DAPP, the Division of Amplitude Polarimetric Pyrometer, which uses polarization information to measure the true thermodynamic temperature of freely radiating objects. This instrument has potential use in materials processing applications in ground- and space-based equipment. Results of thermophysical and thermodynamic measurements using laser reflection as a temperature measuring tool are presented. The impact of these techniques on thermophysical property measurements at high temperature is discussed.
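The way a measured emittance feeds into true-temperature determination can be illustrated with the textbook single-wavelength Wien-approximation correction; this is a generic pyrometry relation, not the DAPP algorithm:

```python
import math

def true_temperature(apparent_t, emissivity, wavelength_m, c2=1.4388e-2):
    """Wien-approximation emissivity correction of a single-wavelength
    radiance (brightness) temperature:
    1/T = 1/T_apparent + (lambda / c2) * ln(emissivity),
    where c2 = h*c/k_B is the second radiation constant (m*K)."""
    inv_t = 1.0 / apparent_t + (wavelength_m / c2) * math.log(emissivity)
    return 1.0 / inv_t
```

For emissivity below 1 the correction raises the temperature above the apparent (brightness) value, which is why an unknown emittance biases conventional pyrometry low.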
Piagetian Cognitive Development and Primary Process Thinking in Children
ERIC Educational Resources Information Center
Wulach, James S.
1977-01-01
Thirty-seven middle-class white children, ages 5-8, were tested on eight Piagetian tasks and the Rorschach test, and divided into preoperational, transitional, and concrete operational groups. Measures of primary process vs. secondary process thinking were found to be related to the Piagetian stages of development. (GDC)
NASA Astrophysics Data System (ADS)
Aldowaisan, Tariq; Allahverdi, Ali
2016-05-01
This paper describes the process of developing programme educational objectives (PEOs) for the Industrial and Management Systems Engineering programme at Kuwait University, and the process of deploying these PEOs. Input from the programme's four constituents (faculty, students, alumni, and employers) is incorporated in the development and update of the PEOs. For each PEO, an assessment process is employed in which performance measures are defined along with target attainment levels. Results from the assessment tools are compared with the target attainment levels to measure performance with regard to the PEOs. The assessment indicates that the results meet or exceed the target attainment levels of the PEOs' performance measures.
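Comparing assessment results with target attainment levels amounts to a simple gap report; a minimal sketch with hypothetical measure names and values:

```python
def assess_peos(results, targets):
    """Compare measured attainment with the target level for each PEO
    performance measure; returns the measures that fall short, mapped
    to (measured, target). Measure names here are illustrative."""
    return {name: (results[name], target)
            for name, target in targets.items()
            if results[name] < target}
```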
Applications of High-speed motion analysis system on Solid Rocket Motor (SRM)
NASA Astrophysics Data System (ADS)
Liu, Yang; He, Guo-qiang; Li, Jiang; Liu, Pei-jin; Chen, Jian
2007-01-01
The high-speed motion analysis system can record images at up to 12,000 fps and analyze them with the image processing system. The system stores data and images directly in electronic memory, which is convenient for management and analysis. The high-speed motion analysis system and the X-ray radiography system were combined to establish a high-speed real-time X-ray radiography system, which can diagnose and measure dynamic, high-speed processes inside opaque structures. Image processing software was developed to improve the quality of the original images and acquire more precise information. This paper introduces typical applications of the high-speed motion analysis system to solid rocket motors (SRMs): studies of the anomalous combustion of solid propellant grains with defects, real-time measurement of insulator erosion, the explosive incision process of a motor, the structure and wave character of the plume during ignition and flameout, measurement of the end burning of solid propellant, measurement of the flame front, and compatibility between airplane and missile during missile launch. Significant results were achieved through this research. In applying the high-speed motion analysis system to solid rocket motors, key problems that degraded image quality, such as motor vibration, power supply instability, geometric distortion, and noise disturbance, were solved. The image processing software improved the capability of measuring image characteristics. The experimental results show that the system is a powerful facility for studying instantaneous, high-speed processes in solid rocket motors. With the development of image processing techniques, the capability of the high-speed motion analysis system has been further enhanced.
ERIC Educational Resources Information Center
Miller, John
1994-01-01
Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)
Xiao, Ting; Stamatakis, Katherine A; McVay, Allese B
Local health departments (LHDs) have an important function in controlling the growing epidemic of obesity in the United States. Data are needed to gain insight into the existence of routine functions and structures of LHDs that support and sustain obesity prevention efforts. The purpose of this study was to develop and examine the reliability of measures to assess foundational LHD organizational processes and functions specific to obesity prevention. Survey measures were developed using a stratified, random sample of US LHDs to assess supportive organizational processes and infrastructure for obesity prevention representing different domains. Data were analyzed using weighted κ and intraclass correlation coefficients to assess test-retest reliability. Most items and summary indices in the majority of survey domains had moderate/substantial or almost perfect reliability. The overall findings support this survey instrument as a reliable measurement tool for a large number of processes and functions that comprise obesity prevention-related capacity in LHDs.
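Weighted κ for two administrations of an ordinal item has a compact definition; the sketch below uses linear weights, a common choice, since the study's weighting scheme is not specified:

```python
def weighted_kappa(r1, r2, n_categories):
    """Cohen's weighted kappa for two ratings of the same items on an
    ordinal 0..k-1 scale, with linear weights w_ij = 1 - |i-j|/(k-1)."""
    n, k = len(r1), n_categories
    # observed joint proportions and marginals
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[a][b] += 1.0 / n
    p1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    w = lambda i, j: 1.0 - abs(i - j) / (k - 1)
    po = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w(i, j) * p1[i] * p2[j] for i in range(k) for j in range(k))
    return (po - pe) / (1.0 - pe)
```

Perfect test-retest agreement yields κ = 1; values near 0 indicate chance-level agreement.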
Latest Developments in the MATRICS Process
Green, Michael Foster; Nuechterlein, Keith H
2010-01-01
The Measurement and Treatment Research to Improve Cognition in Schizophrenia (MATRICS) process has led to several developments in the assessment of cognitive functioning for schizophrenia-treatment studies. The first was the development of a consensus cognitive battery and a United States Food and Drug Administration-endorsed research design. Since the development of the cognitive battery, interest has grown in clinical trials in different countries and in the development of co-primary functional outcome measures for them. The MATRICS Consensus Cognitive Battery has been translated into 11 languages, with more translations under way. A completed study comparing the usefulness of multiple potential co-primary measures suggests that the University of California San Diego Performance-Based Skills Assessment, version II (UPSA-II) is the most suitable for studies conducted in English. These findings suggest that reliable performance-based measures that are easy to administer and highly correlated with cognitive functioning are now available for use in treatment studies. PMID:20622946
2013-01-01
Background: Several measurement tools have been developed to measure health literacy. The tools vary in their approach and design, but few have focused on comprehensive health literacy in populations. This paper describes the design and development of the European Health Literacy Survey Questionnaire (HLS-EU-Q), an innovative, comprehensive tool to measure health literacy in populations. Methods: Based on a conceptual model and definition, the process involved item development, pre-testing, field-testing, external consultation, a plain-language check, and translation from English to Bulgarian, Dutch, German, Greek, Polish, and Spanish. Results: The development process resulted in the HLS-EU-Q, which comprised two sections: a core health literacy section and a section on determinants and outcomes associated with health literacy. The health literacy section included 47 items addressing self-reported difficulties in accessing, understanding, appraising, and applying information in tasks concerning decision making in healthcare, disease prevention, and health promotion. The second section included items related to health behaviour, health status, health service use, community participation, and socio-demographic and socio-economic factors. Conclusions: By illuminating the detailed steps in the design and development of the HLS-EU-Q, we aim to provide a deeper understanding of its purpose, capabilities, and limitations for others using the tool. Through wide application, the vision is that the HLS-EU-Q will be validated in more countries to enhance the understanding of health literacy in different populations. PMID:24112855
Design and Development of a Real-Time Model Attitude Measurement System for Hypersonic Facilities
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Lunsford, Charles B.
2005-01-01
A series of wind tunnel tests have been conducted to evaluate a multi-camera videogrammetric system designed to measure model attitude in hypersonic facilities. The technique utilizes processed video data and applies photogrammetric principles for point tracking to compute model position including pitch, roll and yaw variables. A discussion of the constraints encountered during the design, development, and testing process, including lighting, vibration, operational range and optical access is included. Initial measurement results from the NASA Langley Research Center (LaRC) 31-Inch Mach 10 tunnel are presented.
Thermographic Measurements of the Commercial Laser Powder Bed Fusion Process at NIST.
Lane, Brandon; Moylan, Shawn; Whitenton, Eric; Ma, Li
2016-01-01
Measurement of the high-temperature melt pool region in the laser powder bed fusion (L-PBF) process is a primary focus of researchers seeking to further understand the dynamic physics of the heating, melting, adhesion, and cooling that define this commercially popular additive manufacturing process. This paper details the design, execution, and results of high-speed, high-magnification in-situ thermographic measurements conducted at the National Institute of Standards and Technology (NIST), focusing on the melt pool region of a commercial L-PBF process. Multiple phenomena are observed, including plasma plume and hot particle ejection from the melt region. The thermographic measurement process is detailed with emphasis on the 'measurability' of observed phenomena and the sources of measurement uncertainty. Further discussion relates these thermographic results to other efforts at NIST towards L-PBF process finite element simulation and development of in-situ sensing and control methodologies.
Psychometric evaluation of the English version of the Extended Post-event Processing Questionnaire.
Wong, Quincy J J
2015-01-01
The importance of post-event processing (PEP) in prominent models of social anxiety disorder has led to the development of measures that tap this cognitive construct. The 17-item Extended Post-event Processing Questionnaire (E-PEPQ) is one of the most comprehensive measures of PEP developed to date. However, the measure was developed in German and the psychometric properties of the English version of the E-PEPQ have not yet been examined. The current study examined the factor structure, internal consistency, and construct validity of the English version of the E-PEPQ. English-speaking participants (N = 560) completed the English version of the E-PEPQ, a measure of social anxiety and a measure of depression. A 15-item version of the E-PEPQ with a correlated three-factor structure (referred to as the E-PEPQ-15) emerged as the best fitting model using confirmatory factor analyses, and the E-PEPQ-15 and its subscales demonstrated good internal consistency. The E-PEPQ-15 and two of its three subscales also had significantly stronger positive associations with the social anxiety measure than with the depression measure. The psychometric properties of the E-PEPQ-15 obtained in the current study justify the use of the measure in research, particularly in the domain of social anxiety.
Quality of Care Measures for the Management of Unhealthy Alcohol Use
Hepner, Kimberly A.; Watkins, Katherine E.; Farmer, Carrie M.; Rubenstein, Lisa; Pedersen, Eric R.; Pincus, Harold Alan
2017-01-01
There is a paucity of quality measures to assess care across the spectrum of unhealthy alcohol use, ranging from risky drinking to alcohol use disorders. Using a two-phase expert panel review process, we sought to develop an expanded set of quality of care measures for unhealthy alcohol use, focusing on outpatient care delivered in both primary care and specialty care settings. This process generated 25 candidate measures. Eight measures address screening and assessment, 11 address aspects of treatment, and six address follow-up. These quality measures represent high-priority targets for future development, including creating detailed technical specifications and pilot testing them to evaluate their utility in terms of feasibility, reliability, and validity. PMID:28340902
System enhancements of Mesoscale Analysis and Space Sensor (MASS) computer system
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
Interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program is reported. The development and implementation of new spaceborne remote sensing technology to observe and measure atmospheric processes is described. The space measurements and conventional observational data are processed together to gain an improved understanding of the mesoscale structure and dynamical evolution of the atmosphere relative to cloud development and precipitation processes. A research computer system consisting of three primary computers (HP-1000F, Perkin-Elmer 3250, and Harris/6) was developed, providing a wide range of capabilities for processing and displaying interactively large volumes of remote sensing data. The development of a MASS database management and analysis system on the HP-1000F computer, and the extension of these capabilities by integration with the Perkin-Elmer and Harris/6 computers using MSFC's Apple III microcomputer workstations, is described. The objectives are to design hardware enhancements for computer integration and to provide data conversion and transfer between machines.
Mariani, Rachele; Maskit, Bernard; Bucci, Wilma; De Coro, Alessandra
2013-01-01
The referential process is defined in the context of Bucci's multiple code theory as the process by which nonverbal experience is connected to language. The English computerized measures of the referential process, which have been applied in psychotherapy research, include the Weighted Referential Activity Dictionary (WRAD), and measures of Reflection, Affect and Disfluency. This paper presents the development of the Italian version of the IWRAD by modeling Italian texts scored by judges, and shows the application of the IWRAD and other Italian measures in three psychodynamic treatments evaluated for personality change using the Shedler-Westen Assessment Procedure (SWAP-200). Clinical predictions based on applications of the English measures were supported.
NASA Astrophysics Data System (ADS)
Kuntoro, Hadiyan Yusuf; Hudaya, Akhmad Zidni; Dinaryanto, Okto; Majid, Akmal Irfan; Deendarlianto
2016-06-01
Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed various methods and techniques to study two-phase flow phenomena in industrial cases, such as those in the chemical, petroleum, and nuclear industries. One developing technique is image processing. This technique is widely used in two-phase flow research owing to its non-intrusive capability to process large volumes of visualization data containing many complexities. Moreover, it can capture direct visual information about the flow that is difficult to obtain by other methods. The main objective of this paper is to present an algorithm of the image processing technique, improved over a preceding algorithm, for stratified flow cases. The present algorithm can measure the film thickness (hL) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also aimed at developing a high-quality database of stratified flow, which is currently scanty. In the present work, the measurement results showed satisfactory agreement with previous works.
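A film-thickness extraction of this kind can be sketched with a simple threshold-based segmentation of each video frame. The threshold, pixel scale, and dark-liquid assumption below are illustrative, not the authors' calibrated values:

```python
import numpy as np

def film_thickness(frame, threshold=128, mm_per_px=0.1):
    """Estimate liquid film thickness h_L per image column.

    frame: 2D grayscale array; the liquid phase is assumed darker than
    the gas (values below `threshold`). Threshold and scale are
    illustrative assumptions, not the paper's calibration.
    """
    liquid = frame < threshold      # boolean mask of liquid pixels
    h_px = liquid.sum(axis=0)       # liquid pixel count per column
    return h_px * mm_per_px         # convert to millimetres

# synthetic frame: bottom 20 of 100 rows are dark "liquid"
frame = np.full((100, 50), 200, dtype=np.uint8)
frame[80:, :] = 50
print(film_thickness(frame)[0])     # 2.0
```

Running the same mask per frame over a video sequence yields the time series of interfacial wave heights the abstract describes.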
ERIC Educational Resources Information Center
Wright, John C.; And Others
A conceptual model of how children process televised information was developed with the goal of identifying those parameters of the process that are both measurable and manipulable in research settings. The model presented accommodates the nature of information processing both by the child and by the presentation by the medium. Presentation is…
Developing questionnaires for educational research: AMEE Guide No. 87
La Rochelle, Jeffrey S.; Dezee, Kent J.; Gehlbach, Hunter
2014-01-01
In this AMEE Guide, we consider the design and development of self-administered surveys, commonly called questionnaires. Questionnaires are widely employed in medical education research. Unfortunately, the processes used to develop such questionnaires vary in quality and lack consistent, rigorous standards. Consequently, the quality of the questionnaires used in medical education research is highly variable. To address this problem, this AMEE Guide presents a systematic, seven-step process for designing high-quality questionnaires, with particular emphasis on developing survey scales. These seven steps do not address all aspects of survey design, nor do they represent the only way to develop a high-quality questionnaire. Instead, these steps synthesize multiple survey design techniques and organize them into a cohesive process for questionnaire developers of all levels. Addressing each of these steps systematically will improve the probabilities that survey designers will accurately measure what they intend to measure. PMID:24661014
An improved device to measure cottonseed strength
USDA-ARS?s Scientific Manuscript database
During processing, seeds of cotton cultivars with fragile seeds often break and produce seed coat fragments that can cause processing problems at textile mills. A cottonseed shear tester, previously developed to measure cottonseed strength, was modified with enhancements to the drive system to provi...
Validity and Reliability Study of the Turkish Version of Ego Identity Process Questionairre
ERIC Educational Resources Information Center
Morsünbül, Ümit; Atak, Hasan
2013-01-01
The main developmental task is identity development in adolescence period. Marcia defined four identity statuses based on exploration and commitment process: Achievement, moratorium, foreclosure and diffusion. Certain scales were developed to measure identity development. Another questionnaire that evaluates both four identity statuses and the…
Understanding Acceptance of Software Metrics--A Developer Perspective
ERIC Educational Resources Information Center
Umarji, Medha
2009-01-01
Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…
Real-time optical fiber digital speckle pattern interferometry for industrial applications
NASA Astrophysics Data System (ADS)
Chan, Robert K.; Cheung, Y. M.; Lo, C. H.; Tam, T. K.
1997-03-01
There is current interest, especially in the industrial sector, in using the digital speckle pattern interferometry (DSPI) technique to measure surface stress. Indeed, many publications on the subject are evidence of the growing interest in the field. However, bringing the technology to industrial use requires the integration of several emerging technologies, viz. optics, feedback control, electronics, image processing, and digital signal processing. Due to the highly interdisciplinary nature of the technique, successful implementation and development require expertise in all of these fields. At Baptist University, under the funding of a major industrial grant, we are developing the technology for the industrial sector. Our system fully exploits optical fibers and diode lasers in the design to enable practical and rugged systems suited to industrial applications. Besides the development in optics, we have broken away from reliance on a microcomputer PC platform for both image capture and processing, and have developed a digital signal processing array system that can handle simultaneous and independent image capture/processing with feedback control. The system, named CASPA for 'cascadable architecture signal processing array,' is a third-generation development system that utilizes up to 7 digital signal processors and has proved to be very powerful. With CASPA we are now in a better position to develop novel optical measurement systems for industrial applications that may require different measurement systems to operate concurrently and to exchange information between systems. Applications such as simultaneous in-plane and out-of-plane DSPI image capture/processing, vibrational analysis with interactive DSPI, and phase-shifting control of optical systems are a few good examples of its potential.
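The phase-shifting control mentioned above is commonly implemented with the standard four-step algorithm: four frames I_k = A + B·cos(φ + δ_k) at shifts δ = 0, π/2, π, 3π/2 give the wrapped phase as φ = arctan2(I₄ − I₂, I₁ − I₃). A minimal sketch, assuming ideal π/2 shifts (not the CASPA implementation):

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Recover the wrapped phase from four frames shifted by
    0, pi/2, pi, 3pi/2: I4 - I2 = 2B sin(phi), I1 - I3 = 2B cos(phi)."""
    return np.arctan2(i4 - i2, i1 - i3)

# synthetic check against a known phase field
phi = np.linspace(-1.0, 1.0, 5)
A, B = 100.0, 50.0
frames = [A + B * np.cos(phi + d) for d in (0, np.pi/2, np.pi, 3*np.pi/2)]
print(np.allclose(four_step_phase(*frames), phi))  # True
```

In practice the recovered phase is wrapped to (−π, π] and must be unwrapped before conversion to displacement.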
Toomey, Russell B.; Anhalt, Karla; Shramko, Maura
2016-01-01
The processes of identity exploration and resolution are salient during adolescence and young adulthood, and awareness of sexual orientation identity, in particular, is heightened in early adolescence. Much of the research on sexual orientation identity development has focused on identity milestones (e.g., age of awareness and disclosure) or internalized homonegativity, rather than the developmental processes of exploration and resolution. Psychometric properties of the Sexual Orientation Identity Development Scale, which was adapted from a developmentally-informed measure of ethnic-racial identity, were evaluated in a sample of 382 Latina/o sexual minority adolescents and young adults. Results supported the reliability and validity of the adapted measure, as well as measurement equivalence across language (Spanish and English) and development (adolescence and young adulthood). PMID:27398072
NASA Astrophysics Data System (ADS)
Dunn, Michael
2008-10-01
For over 30 years, the Oak Ridge National Laboratory (ORNL) has performed research and development to provide more accurate nuclear cross-section data in the resonance region. The ORNL Nuclear Data (ND) Program consists of four complementary areas of research: (1) cross-section measurements at the Oak Ridge Electron Linear Accelerator; (2) resonance analysis methods development with the SAMMY R-matrix analysis software; (3) cross-section evaluation development; and (4) cross-section processing methods development with the AMPX software system. The ND Program is tightly coupled with nuclear fuel cycle analyses and radiation transport methods development efforts at ORNL. Thus, nuclear data work is performed in concert with nuclear science and technology needs and requirements. Recent advances in each component of the ORNL ND Program have led to improvements in resonance region measurements, R-matrix analyses, cross-section evaluations, and processing capabilities that directly support radiation transport research and development. Of particular importance are the improvements in cross-section covariance data evaluation and processing capabilities. The benefit of these advances to nuclear science and technology research and development will be discussed during the symposium on Nuclear Physics Research Connections to Nuclear Energy.
Laser metrology in food-related systems
NASA Astrophysics Data System (ADS)
Mendoza-Sanchez, Patricia; Lopez, Daniel; Kongraksawech, Teepakorn; Vazquez, Pedro; Torres, J. Antonio; Ramirez, Jose A.; Huerta-Ruelas, Jorge
2005-02-01
An optical system was developed using a low-cost semiconductor laser and commercial optical and electronic components to monitor food processes by measuring changes in the optical rotation (OR) of chiral compounds. The OR signal as a function of processing time and sample temperature was collected and recorded using a computer data acquisition system. The system has been tested on two different processes: sugar-protein interaction and the beer fermentation process. To study sugar-protein interaction, the sugars sorbitol, trehalose, and sucrose were used, with bovine serum albumin (BSA, A-7906 Sigma-Aldrich) as the protein. In some food processes, sugars are added to protect proteins from damage during processing, storage, and/or distribution. Different sugar/protein solutions were prepared and heated above the critical temperature of protein denaturation. OR measurements were performed during the heating process, and the effect of different sugars on protein denaturation was measured. These measurements showed higher sensitivity than differential scanning calorimetry, which requires higher protein concentrations to study these interactions. The brewing fermentation process was monitored in situ using the OR system and validated by correlation with specific density measurements and gas chromatography. This instrument can be implemented to monitor fermentation on-line, thereby determining the end of the process and optimizing process conditions in an industrial setting. The developed OR system has no moving parts and is more flexible than commercial polarimeters, providing the capability of implementation in harsh environments and signifying the potential of this method as an in-line technique for quality control in food processing and for experimentation with optically active solutions.
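The link between measured optical rotation and analyte concentration follows Biot's law, α = [α]·l·c, so concentration can be read straight off the OR signal once the specific rotation and path length are known. A minimal sketch, using the well-known specific rotation of sucrose as an example value:

```python
def concentration_from_rotation(alpha_obs_deg, specific_rotation, path_dm):
    """Biot's law: alpha = [a] * l * c, hence c (g/mL) = alpha / ([a] * l).
    specific_rotation in deg mL g^-1 dm^-1, path length in dm."""
    return alpha_obs_deg / (specific_rotation * path_dm)

# sucrose: [a]_D is about +66.5 deg mL g^-1 dm^-1; 1 dm sample cell
print(round(concentration_from_rotation(6.65, 66.5, 1.0), 3))  # 0.1
```

During fermentation the OR falls as sucrose is consumed, which is why the signal correlates with specific density and chromatography measurements.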
Cultural Differences in the Development of Processing Speed
ERIC Educational Resources Information Center
Kail, Robert V.; McBride-Chang, Catherine; Ferrer, Emilio; Cho, Jeung-Ryeul; Shu, Hua
2013-01-01
The aim of the present work was to examine cultural differences in the development of speed of information processing. Four samples of US children ("N" = 509) and four samples of East Asian children ("N" = 661) completed psychometric measures of processing speed on two occasions. Analyses of the longitudinal data indicated…
Self-referenced processing, neurodevelopment and joint attention in autism.
Mundy, Peter; Gwaltney, Mary; Henderson, Heather
2010-09-01
This article describes a parallel and distributed processing model (PDPM) of joint attention, self-referenced processing and autism. According to this model, autism involves early impairments in the capacity for rapid, integrated processing of self-referenced (proprioceptive and interoceptive) and other-referenced (exteroceptive) information. Measures of joint attention have proven useful in research on autism because they are sensitive to the early development of the 'parallel' and integrated processing of self- and other-referenced stimuli. Moreover, joint attention behaviors are a consequence, but also an organizer of the functional development of a distal distributed cortical system involving anterior networks including the prefrontal and insula cortices, as well as posterior neural networks including the temporal and parietal cortices. Measures of joint attention provide early behavioral indicators of atypical development in this parallel and distributed processing system in autism. In addition it is proposed that an early, chronic disturbance in the capacity for integrating self- and other-referenced information may have cascading effects on the development of self awareness in autism. The assumptions, empirical support and future research implications of this model are discussed.
Lin, Steve; Turgulov, Anuar; Taher, Ahmed; Buick, Jason E; Byers, Adam; Drennan, Ian R; Hu, Samantha; J Morrison, Laurie
2016-10-01
Cardiopulmonary resuscitation (CPR) process measures research and quality assurance have traditionally been limited to the first 5 minutes of resuscitation due to the significant costs in time, resources, and personnel of manual data abstraction. CPR performance may change over time during prolonged resuscitations, which represents a significant knowledge gap. Moreover, currently available commercial software outputs of CPR process measures are difficult to analyze. The objective was to develop and validate a software program to help automate the abstraction and transfer of CPR process measures data from electronic defibrillators for complete episodes of cardiac arrest resuscitation. We developed a software program to facilitate and help automate CPR data abstraction and transfer from electronic defibrillators for entire resuscitation episodes. Using an intermediary Extensible Markup Language export file, the automated software transfers CPR process measures data (electrocardiogram [ECG] number, CPR start time, number of ventilations, number of chest compressions, compression rate per minute, compression depth per minute, compression fraction, and end-tidal CO2 per minute). We performed an internal validation of the software program on 50 randomly selected cardiac arrest cases with resuscitation durations between 15 and 60 minutes. CPR process measures were manually abstracted and transferred independently by two trained data abstractors and by the automated software program, followed by manual interpretation of raw ECG tracings, treatment interventions, and patient events. Error rates and the time needed for data abstraction, transfer, and interpretation were measured for both manual and automated methods, compared to an additional independent reviewer. A total of 9,826 data points were each abstracted by the two abstractors and by the software program. Manual data abstraction resulted in a total of six errors (0.06%) compared to zero errors by the software program.
The mean ± SD time measured per case for manual data abstraction was 20.3 ± 2.7 minutes compared to 5.3 ± 1.4 minutes using the software program (p = 0.003). We developed and validated an automated software program that efficiently abstracts and transfers CPR process measures data from electronic defibrillators for complete cardiac arrest episodes. This software will enable future cardiac arrest studies and quality assurance programs to evaluate the impact of CPR process measures during prolonged resuscitations. © 2016 by the Society for Academic Emergency Medicine.
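An intermediary XML export of this kind can be abstracted with a few lines of standard-library code. The element and attribute names below are hypothetical, since real defibrillator exports use vendor-specific schemas:

```python
import xml.etree.ElementTree as ET

xml_export = """<episode ecg="1234">
  <minute t="1" compressions="104" rate="104" depth="5.2" ventilations="8"/>
  <minute t="2" compressions="110" rate="110" depth="5.0" ventilations="9"/>
</episode>"""

def abstract_cpr(xml_text):
    """Pull per-minute CPR process measures from an XML export.
    Tag/attribute names are made-up placeholders for illustration."""
    root = ET.fromstring(xml_text)
    rows = []
    for m in root.iter("minute"):
        rows.append({"minute": int(m.get("t")),
                     "compressions": int(m.get("compressions")),
                     "rate_per_min": float(m.get("rate")),
                     "depth_cm": float(m.get("depth")),
                     "ventilations": int(m.get("ventilations"))})
    return rows

rows = abstract_cpr(xml_export)
print(len(rows), rows[0]["rate_per_min"])  # 2 104.0
```

Iterating per-minute records like this is what makes complete-episode abstraction cheap relative to manual review of the raw tracings.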
National Quality Measures for Child Mental Health Care: Background, Progress, and Next Steps
Murphy, J. Michael; Scholle, Sarah Hudson; Hoagwood, Kimberly Eaton; Sachdeva, Ramesh C.; Mangione-Smith, Rita; Woods, Donna; Kamin, Hayley S.; Jellinek, Michael
2013-01-01
OBJECTIVE: To review recent health policies related to measuring child health care quality, the selection processes of national child health quality measures, the nationally recommended quality measures for child mental health care and their evidence strength, the progress made toward developing new measures, and early lessons learned from these national efforts. METHODS: Methods used included description of the selection process of child health care quality measures from 2 independent national initiatives, the recommended quality measures for child mental health care, and the strength of scientific evidence supporting them. RESULTS: Of the child health quality measures recommended or endorsed during these national initiatives, only 9 unique measures were related to child mental health. CONCLUSIONS: The development of new child mental health quality measures poses methodologic challenges that will require a paradigm shift to align research with its accelerated pace. PMID:23457148
In Situ Fringe Projection Profilometry for Laser Powder Bed Fusion Process
NASA Astrophysics Data System (ADS)
Zhang, Bin
Additive manufacturing (AM) offers an industrial solution to produce parts with complex geometries and internal structures that conventional manufacturing techniques cannot produce. However, current metal additive processes, particularly the laser powder bed fusion (LPBF) process, suffer from poor surface finish and various material defects, which hinder their wide application. One way to address this problem is to add an in situ metrology sensor to the machine chamber. Mature manufacturing processes are tightly monitored and controlled, and instrumentation advances are needed to realize this same advantage for metal additive processes. This motivated us to develop an in situ fringe projection system for the LPBF process. The development of such a system and its measurement capability are demonstrated in this dissertation. We show that this system can measure various powder bed signatures, including powder layer variations, the average height drop between fused metal and unfused powder, and the height variations on the fused surfaces. The ability to measure textured surfaces is also evaluated through the instrument transfer function (ITF). We analyze the mathematical model of the proposed fringe projection system and prove the linearity of the system through simulations. A practical ITF measurement technique using a stepped surface is also demonstrated. The measurement results are compared with theoretical predictions generated through the ITF simulations.
A Focusing Method in the Calibration Process of Image Sensors Based on IOFBs
Fernández, Pedro R.; Lázaro, José L.; Gardel, Alfredo; Cano, Ángel E.; Bravo, Ignacio
2010-01-01
A focusing procedure in the calibration process of image sensors based on Incoherent Optical Fiber Bundles (IOFBs) is described, using the information extracted from the fibers. This procedure differs from other currently known focusing methods due to the non-spatial in-out correspondence between fibers, which produces a natural codification of the image to transmit. Measuring focus is essential prior to carrying out calibration in order to guarantee accurate processing and decoding. Four algorithms have been developed to estimate the focus measure: two based on mean grey level and two based on variance. In this paper, a few simple focus measures are defined and compared. Some experimental results for the focus measure and the accuracy of the developed methods are discussed in order to demonstrate their effectiveness. PMID:22315526
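Mean- and variance-based focus measures of the kind described above can be sketched in a few lines: a sharp image preserves contrast, so its grey-level variance is higher than a defocused one even when the mean is unchanged. This is a generic illustration, not the IOFB-specific algorithms:

```python
import numpy as np

def focus_variance(img):
    """Variance-based focus measure: sharper images show higher
    grey-level variance."""
    return float(np.var(img))

def focus_mean(img):
    """Mean grey level: useful when defocus changes overall brightness."""
    return float(np.mean(img))

sharp   = np.array([[0, 255], [255, 0]], dtype=float)
blurred = np.full((2, 2), 127.5)   # same mean grey level, no contrast
print(focus_variance(sharp) > focus_variance(blurred))  # True
```

Scanning the lens position and maximizing such a measure gives the best-focus setting before calibration.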
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, J M; Ehinger, M H; Joseph, C
1978-10-01
Development work on a computerized system for nuclear materials control and accounting in a nuclear fuel reprocessing plant is described and evaluated. Hardware and software were installed and tested to demonstrate key measurement, measurement control, and accounting requirements at accountability input/output points using natural uranium. The demonstration included a remote data acquisition system which interfaces process and special instrumentation to a central processing unit.
Developing image processing meta-algorithms with data mining of multiple metrics.
Leung, Kelvin; Cunha, Alexandre; Toga, A W; Parker, D Stott
2014-01-01
People often use multiple metrics in image processing, but here we take a novel approach of mining the values of batteries of metrics on image processing results. We present a case for extending image processing methods to incorporate automated mining of multiple image metric values. Here by a metric we mean any image similarity or distance measure, and in this paper we consider intensity-based and statistical image measures and focus on registration as an image processing problem. We show how it is possible to develop meta-algorithms that evaluate different image processing results with a number of different metrics and mine the results in an automated fashion so as to select the best results. We show that the mining of multiple metrics offers a variety of potential benefits for many image processing problems, including improved robustness and validation.
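A meta-algorithm of this kind can be sketched as: score every candidate result with a battery of metrics, rank the candidates per metric, and keep the one with the best combined rank. The two-metric battery and rank-sum rule below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def metrics(fixed, moved):
    """A small battery of intensity-based similarity metrics."""
    mse = float(np.mean((fixed - moved) ** 2))                      # lower is better
    ncc = float(np.corrcoef(fixed.ravel(), moved.ravel())[0, 1])    # higher is better
    return {"mse": mse, "ncc": ncc}

def select_best(fixed, candidates):
    """Rank candidates under each metric and keep the best rank sum."""
    scored = [metrics(fixed, c) for c in candidates]
    mse_rank = np.argsort(np.argsort([s["mse"] for s in scored]))
    ncc_rank = np.argsort(np.argsort([-s["ncc"] for s in scored]))
    return int(np.argmin(mse_rank + ncc_rank))

rng = np.random.default_rng(0)
fixed = rng.normal(size=(16, 16))
good = fixed + 0.01 * rng.normal(size=(16, 16))   # near-perfect registration
bad = rng.normal(size=(16, 16))                    # unrelated result
print(select_best(fixed, [bad, good]))             # 1
```

Mining several metrics jointly, rather than trusting one, is what gives the robustness benefit the abstract claims for registration.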
Alcohol and Sexual Consent Scale: Development and Validation
ERIC Educational Resources Information Center
Ward, Rose Marie; Matthews, Molly R.; Weiner, Judith; Hogan, Kathryn M.; Popson, Halle C.
2012-01-01
Objective: To establish a short measure of attitudes toward sexual consent in the context of alcohol consumption. Methods: Using a multistage and systematic measurement development process, the investigators developed the Alcohol and Sexual Consent Scale using a sample of college students. Results: The resulting 12-item scale, the Alcohol and…
NASA Astrophysics Data System (ADS)
Kannan, Rohit; Tangirala, Arun K.
2014-06-01
Identification of directional influences in multivariate systems is of prime importance in several applications of engineering and sciences such as plant topology reconstruction, fault detection and diagnosis, and neurosciences. A spectrum of related directionality measures, ranging from linear measures such as partial directed coherence (PDC) to nonlinear measures such as transfer entropy, have emerged over the past two decades. The PDC-based technique is simple and effective, but being a linear directionality measure has limited applicability. On the other hand, transfer entropy, despite being a robust nonlinear measure, is computationally intensive and practically implementable only for bivariate processes. The objective of this work is to develop a nonlinear directionality measure, termed as KPDC, that possesses the simplicity of PDC but is still applicable to nonlinear processes. The technique is founded on a nonlinear measure called correntropy, a recently proposed generalized correlation measure. The proposed method is equivalent to constructing PDC in a kernel space where the PDC is estimated using a vector autoregressive model built on correntropy. A consistent estimator of the KPDC is developed and important theoretical results are established. A permutation scheme combined with the sequential Bonferroni procedure is proposed for testing hypothesis on absence of causality. It is demonstrated through several case studies that the proposed methodology effectively detects Granger causality in nonlinear processes.
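Correntropy, the generalized correlation measure on which the KPDC is founded, has a simple sample estimator: V(X, Y) = E[κ_σ(X − Y)] with a Gaussian kernel. A minimal sketch; the bandwidth σ is a user-chosen assumption:

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy V(X, Y) = E[k_sigma(X - Y)]
    using a Gaussian kernel of bandwidth sigma."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.mean(np.exp(-d ** 2 / (2 * sigma ** 2))))

x = np.zeros(100)
print(correntropy(x, x))          # 1.0 : identical series hit the kernel peak
print(correntropy(x, x + 3.0))    # small : large discrepancies are suppressed
```

Because the kernel maps samples into a reproducing kernel Hilbert space, second-order tools such as vector autoregression on correntropy can capture nonlinear dependence, which is the idea behind constructing PDC in that kernel space.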
Trifigny, Nicolas; Kelly, Fern M.; Cochrane, Cédric; Boussu, François; Koncar, Vladan; Soulat, Damien
2013-01-01
The quality of fibrous reinforcements used in composite materials can be monitored during the weaving process. Fibrous sensors previously developed in our laboratory, based on PEDOT:PSS, have been adapted so as to directly measure the mechanical stress on fabrics under static or dynamic conditions. The objective of our research has been to develop new sensor yarns, with the ability to locally detect mechanical stresses all along the warp or weft yarn. This local detection is undertaken inside the weaving loom in real time during the weaving process. Suitable electronic devices have been designed in order to record in situ measurements delivered by this new fibrous sensor yarn. PMID:23959238
DEVELOPMENT AND APPLICATION OF PLANNING PROCESS TO ACHIEVE SUSTAINABILITY
Concepts of sustainability are numerous, widely discussed, and necessary, but to succeed, sustainability must be applied to development projects. However, few such applications have been made, and their measures are unclear. Sustainability indicators are typically used as measures, but ...
ERIC Educational Resources Information Center
Minix, Nancy; And Others
The process used to evaluate progress in identifying the goals to be used in evaluating teacher performance under the Kentucky Career Ladder Program is described. The process pertains to two areas of teacher development: (1) professional growth and development, and (2) professional leadership and initiative. A total of 1,650 individuals were asked…
Soft sensor for monitoring biomass subpopulations in mammalian cell culture processes.
Kroll, Paul; Stelzer, Ines V; Herwig, Christoph
2017-11-01
Biomass subpopulations in mammalian cell culture processes cause impurities and influence productivity, which makes it necessary to monitor this critical process parameter in real time. For this reason, a novel soft sensor concept for estimating viable, dead and lysed cell concentration was developed, based on the robust and cheap in situ measurements of permittivity and turbidity in combination with a simple model. It could be shown that the turbidity measurements contain information about all investigated biomass subpopulations. The novelty of the developed soft sensor is the real-time estimation of lysed cell concentration, which is directly correlated to process-related impurities such as DNA and host cell protein in the supernatant. Based on data generated by two fed-batch processes, the developed soft sensor is described and discussed. The presented soft sensor concept provides a tool for estimating viable, dead and lysed cell concentration in real time with adequate accuracy and enables further applications with respect to process optimization and control.
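A minimal sketch of how a two-signal soft sensor of this kind could invert a linear observation model for the three subpopulations. All calibration coefficients and the model-supplied cell total below are hypothetical placeholders, not the authors' values:

```python
import numpy as np

# Hypothetical calibration: permittivity responds to viable cells only,
# turbidity responds to all three subpopulations, and a simple growth
# model supplies the cumulative total of cells ever produced.
K_PERM = 0.8                         # permittivity per unit viable cells (assumed)
K_TURB = np.array([0.5, 0.6, 0.2])   # turbidity response per subpopulation (assumed)

def estimate_subpopulations(perm: float, turb: float, total_model: float):
    """Estimate (viable, dead, lysed) concentrations from two in situ
    signals plus a model-predicted cumulative cell total."""
    A = np.array([
        [K_PERM, 0.0, 0.0],   # permittivity equation
        K_TURB,               # turbidity equation
        [1.0, 1.0, 1.0],      # balance: viable + dead + lysed = total
    ])
    b = np.array([perm, turb, total_model])
    return np.linalg.solve(A, b)

xv, xd, xl = estimate_subpopulations(perm=4.0, turb=4.1, total_model=8.0)
```

The third equation is what the abstract's "simple model" would contribute; without it, two signals cannot separate three subpopulations.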
Validation of the Vanderbilt Holistic Face Processing Test
Wang, Chao-Chih; Ross, David A.; Gauthier, Isabel; Richler, Jennifer J.
2016-01-01
The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the same construct as the composite task, which is a group-based measure at the center of the large literature on holistic face processing. In Experiment 1, we found a significant correlation between holistic processing measured in the VHPT-F and the composite task. Although this correlation was small, it was comparable to the correlation between holistic processing measured in the composite task with the same faces, but different target parts (top or bottom), which represents a reasonable upper limit for correlations between the composite task and another measure of holistic processing. These results confirm the validity of the VHPT-F by demonstrating shared variance with another measure of holistic processing based on the same operational definition. These results were replicated in Experiment 2, but only when the demographic profile of our sample matched that of Experiment 1. PMID:27933014
Automated method for the systematic interpretation of resonance peaks in spectrum data
Damiano, B.; Wood, R.T.
1997-04-22
A method is described for spectral signature interpretation. The method includes the creation of a mathematical model of a system or process. A neural network training set is then developed based upon the mathematical model. The neural network training set is developed by using the mathematical model to generate measurable phenomena of the system or process based upon model input parameters that correspond to the physical condition of the system or process. The neural network training set is then used to adjust internal parameters of a neural network. The physical condition of an actual system or process represented by the mathematical model is then monitored by extracting spectral features from measured spectra of the actual process or system. The spectral features are then input into said neural network to determine the physical condition of the system or process represented by the mathematical model. More specifically, the neural network correlates the spectral features (i.e. measurable phenomena) of the actual process or system with the corresponding model input parameters. The model input parameters relate to specific components of the system or process, and, consequently, correspond to the physical condition of the process or system. 1 fig.
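The patent's training-set idea can be sketched with a hypothetical one-parameter forward model (resonance frequency shifting with a stiffness parameter); for brevity, a nearest-neighbour lookup stands in for the neural network whose internal parameters would be adjusted on the generated set:

```python
import numpy as np

# Hypothetical forward model: a single physical parameter (e.g. support
# stiffness, with unit mass) determines the resonance peak location.
def forward_model(stiffness):
    return np.sqrt(stiffness)

# 1. Generate the training set by sweeping the model input parameter
#    and recording the resulting measurable spectral feature.
params = np.linspace(1.0, 100.0, 500)
features = forward_model(params)

# 2. "Train": a nearest-neighbour table stands in here for the neural
#    network that would learn the feature-to-parameter mapping.
def infer_condition(measured_peak_hz: float) -> float:
    idx = np.argmin(np.abs(features - measured_peak_hz))
    return float(params[idx])

# 3. Monitor: a measured spectral feature maps back to the physical state.
estimate = infer_condition(measured_peak_hz=7.0)  # true stiffness near 49
```

The essential point matches the patent: the mapping is learned from model-generated data, so the monitored system's physical condition is read off from measured spectral features without solving the inverse problem analytically.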
Measuring End-of-Life Care Processes in Nursing Homes
ERIC Educational Resources Information Center
Temkin-Greener, Helena; Zheng, Nan; Norton, Sally A.; Quill, Timothy; Ladwig, Susan; Veazie, Peter
2009-01-01
Purpose: The objectives of this study were to develop measures of end-of-life (EOL) care processes in nursing homes and to validate the instrument for measuring them. Design and Methods: A survey of directors of nursing was conducted in 608 eligible nursing homes in New York State. Responses were obtained from 313 (51.5% response rate) facilities.…
Thermographic Measurements of the Commercial Laser Powder Bed Fusion Process at NIST
Lane, Brandon; Moylan, Shawn; Whitenton, Eric; Ma, Li
2016-01-01
Measurement of the high-temperature melt pool region in the laser powder bed fusion (L-PBF) process is a primary focus of researchers to further understand the dynamic physics of the heating, melting, adhesion, and cooling which define this commercially popular additive manufacturing process. This paper will detail the design, execution, and results of high speed, high magnification in-situ thermographic measurements conducted at the National Institute of Standards and Technology (NIST) focusing on the melt pool region of a commercial L-PBF process. Multiple phenomena are observed including plasma plume and hot particle ejection from the melt region. The thermographic measurement process will be detailed with emphasis on the ‘measurability’ of observed phenomena and the sources of measurement uncertainty. Further discussion will relate these thermographic results to other efforts at NIST towards L-PBF process finite element simulation and development of in-situ sensing and control methodologies. PMID:28058036
ERIC Educational Resources Information Center
Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.
2018-01-01
Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…
Student Team Projects in Information Systems Development: Measuring Collective Creative Efficacy
ERIC Educational Resources Information Center
Cheng, Hsiu-Hua; Yang, Heng-Li
2011-01-01
For information systems development project student teams, learning how to improve software development processes is an important training. Software process improvement is an outcome of a number of creative behaviours. Social cognitive theory states that the efficacy of judgment influences behaviours. This study explores the impact of three types…
An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application
ERIC Educational Resources Information Center
Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth
2016-01-01
Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
ATM Coastal Topography-Alabama 2001
Nayegandhi, Amar; Yates, Xan; Brock, John C.; Sallenger, A.H.; Bonisteel, Jamie M.; Klipp, Emily S.; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the Alabama coastline, acquired October 3-4, 2001. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative scanning Lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning Lidar system that measures high-resolution topography of the land surface, and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of Lidar data in an interactive or batch mode. Modules for pre-survey flight line definition, flight path plotting, Lidar raster and waveform investigation, and digital camera image playback have been developed. Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. 
ALPS is routinely used to create maps that represent submerged or first surface topography.
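The first/last significant-return extraction that ALPS applies to each waveform can be illustrated with a simple threshold pass. This is a hypothetical sketch with an assumed 1 ns sample interval, not the actual ALPS algorithm:

```python
import numpy as np

C = 0.299792458   # speed of light, meters per nanosecond
SAMPLE_NS = 1.0   # assumed digitizer sample interval

def first_last_returns(waveform, threshold):
    """One-way ranges (m) to the first and last waveform samples whose
    amplitude exceeds the noise threshold; None if no return found."""
    above = np.flatnonzero(np.asarray(waveform) > threshold)
    if above.size == 0:
        return None
    # Divide by 2: the recorded delay is two-way travel time
    to_range = lambda i: i * SAMPLE_NS * C / 2.0
    return to_range(above[0]), to_range(above[-1])

# Synthetic waveform: canopy return near sample 20, ground near sample 60
wf = np.zeros(100)
wf[18:23] = [5, 40, 90, 35, 6]
wf[58:62] = [8, 70, 50, 9]
first, last = first_last_returns(wf, threshold=10)
```

The gap between the first and last return is what lets a waveform lidar separate vegetation-top from ground elevations in a single pulse.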
ATM Coastal Topography-Florida 2001: Eastern Panhandle
Yates, Xan; Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Bonisteel, Jamie M.; Klipp, Emily S.; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the eastern Florida panhandle coastline, acquired October 2, 2001. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative scanning Lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning Lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of Lidar data in an interactive or batch mode. Modules for presurvey flight line definition, flight path plotting, Lidar raster and waveform investigation, and digital camera image playback have been developed. Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. 
ALPS is routinely used to create maps that represent submerged or first surface topography.
[Current state of competence assessment in nursing].
Darmann-Finck, Ingrid; Reuschenbach, Bernd
2013-01-01
Competency measurement is central to the optimisation of outcome-oriented educational processes in nursing, similar to the concept of evidence-based practice. The classification of measurement tools provides the basis for describing the current state of research and development in competence measurement in nursing science, and existing gaps are identified. The article concludes by questioning the importance of outcome-oriented quality assessment for achieving an increase in quality during training. Further methodological developments and qualitative studies are needed to examine the context-specific processes of interaction and learning beyond competence diagnostics. Copyright © 2012. Published by Elsevier GmbH.
ERIC Educational Resources Information Center
Swingler, Margaret M.; Perry, Nicole B.; Calkins, Susan D.; Bell, Martha Ann
2017-01-01
We apply a biopsychosocial conceptualization to attention development in the 1st year and examine the role of neurophysiological and social processes on the development of early attention processes. We tested whether maternal behavior measured during 2 mother-child interaction tasks when infants (N = 388) were 5 months predicted infant medial…
ERIC Educational Resources Information Center
Archibald, Lisa M. D.; Gathercole, Susan E.
2007-01-01
This study investigated the verbal and visuospatial processing and storage skills of children with SLI and typically developing children. Fourteen school-age children with SLI, and two groups of typically developing children matched either for age or language abilities, completed measures of processing speed and storage capacity, and a set of…
NASA Technical Reports Server (NTRS)
Bentley, P. B.
1975-01-01
The measurement of the volume flow-rate of blood in an artery or vein requires both an estimate of the flow velocity and its spatial distribution and the corresponding cross-sectional area. Transcutaneous measurements of these parameters can be performed using ultrasonic techniques that are analogous to the measurement of moving objects by use of a radar. Modern digital data recording and preprocessing methods were applied to the measurement of blood-flow velocity by means of the CW Doppler ultrasonic technique. Only the average flow velocity was measured and no distribution or size information was obtained. Evaluations of current flowmeter design and performance, ultrasonic transducer fabrication methods, and other related items are given. The main thrust was the development of effective data-handling and processing methods by application of modern digital techniques. The evaluation resulted in useful improvements in both the flowmeter instrumentation and the ultrasonic transducers. Effective digital processing algorithms that provided enhanced blood-flow measurement accuracy and sensitivity were developed. Block diagrams illustrative of the equipment setup are included.
DOT National Transportation Integrated Search
2011-01-01
This study develops an enhanced transportation planning framework by augmenting the sequential four-step : planning process with post-processing techniques. The post-processing techniques are incorporated through a feedback : mechanism and aim to imp...
Sudore, Rebecca L.; Stewart, Anita L.; Knight, Sara J.; McMahan, Ryan D.; Feuz, Mariko; Miao, Yinghui; Barnes, Deborah E.
2013-01-01
Introduction Advance directives have traditionally been considered the gold standard for advance care planning. However, recent evidence suggests that advance care planning involves a series of multiple discrete behaviors for which people are in varying stages of behavior change. The goal of our study was to develop and validate a survey to measure the full advance care planning process. Methods The Advance Care Planning Engagement Survey assesses “Process Measures” of factors known from Behavior Change Theory to affect behavior (knowledge, contemplation, self-efficacy, and readiness, using 5-point Likert scales) and “Action Measures” (yes/no) of multiple behaviors related to surrogate decision makers, values and quality of life, flexibility for surrogate decision making, and informed decision making. We administered surveys at baseline and 1 week later to 50 diverse, older adults from San Francisco hospitals. Internal consistency reliability of Process Measures was assessed using Cronbach's alpha (only continuous variables) and test-retest reliability of Process and Action Measures was examined using intraclass correlations. For discriminant validity, we compared Process and Action Measure scores between this cohort and 20 healthy college students (mean age 23.2 years, SD 2.7). Results Mean age was 69.3 (SD 10.5) and 42% were non-White. The survey took a mean of 21.4 minutes (±6.2) to administer. The survey had good internal consistency (Process Measures Cronbach's alpha, 0.94) and test-retest reliability (Process Measures intraclass correlation, 0.70; Action Measures, 0.87). Both Process and Action Measure scores were higher in the older than younger group, p<.001. Conclusion A new Advance Care Planning Engagement Survey that measures behavior change (knowledge, contemplation, self-efficacy, and readiness) and multiple advance care planning actions demonstrates good reliability and validity. 
Further research is needed to assess whether survey scores improve in response to advance care planning interventions and whether scores are associated with receipt of care consistent with one's wishes. PMID:24039772
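The internal-consistency statistic reported for the Process Measures can be sketched in a few lines; the Likert data below are synthetic, not the study's:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return float((k / (k - 1)) * (1.0 - item_vars.sum() / total_var))

# Simulated 5-point Likert responses: a shared latent trait plus
# item-specific noise, rounded and clipped to the 1-5 scale
rng = np.random.default_rng(0)
trait = rng.normal(0.0, 1.0, size=(50, 1))
scores = np.clip(np.round(3 + trait + rng.normal(0.0, 0.8, size=(50, 4))), 1, 5)
alpha = cronbach_alpha(scores)
```

Because the simulated items share a latent trait, alpha comes out well above zero; test-retest reliability would instead use the intraclass correlation across the two administrations, as the study did.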
Habituation of the orienting reflex and the development of Preliminary Process Theory.
Barry, Robert J
2009-09-01
The orienting reflex (OR), elicited by an innocuous stimulus, can be regarded as a model of the organism's interaction with its environment, and has been described as the unit of attentional processing. A major determinant of the OR is the novelty of the eliciting stimulus, generally operationalized in terms of its reduction with stimulus repetition, the effects of which are commonly described in habituation terms. This paper provides an overview of a research programme, spanning more than 30 years, investigating psychophysiological aspects of the OR in humans. The major complication in this research is that the numerous physiological measures used as dependent variables in the OR context fail to jointly covary with stimulus parameters. This has led to the development of the Preliminary Process Theory (PPT) of the OR to accommodate the complexity of the observed stimulus-response patterns. PPT is largely grounded in autonomic measures, and current work is attempting to integrate electroencephalographic measures, particularly components in the event-related brain potentials reflecting aspects of stimulus processing. The emphasis in the current presentation is on the use of the defining criteria of the habituation phenomenon, and Groves and Thompson's Dual-process Theory, in the development of PPT.
Quality Measures for the Care of Adult Patients with Obstructive Sleep Apnea
Aurora, R. Nisha; Collop, Nancy A.; Jacobowitz, Ofer; Thomas, Sherene M.; Quan, Stuart F.; Aronsky, Amy J.
2015-01-01
Obstructive sleep apnea (OSA) is a prevalent disorder associated with a multitude of adverse outcomes when left untreated. There is significant heterogeneity in the evaluation and management of OSA resulting in variation in cost and outcomes. Thus, the goal for developing these measures was to have a way to evaluate the outcomes and reliability of the processes involved with the standard care approaches used in the diagnosis and management of OSA. The OSA quality care measures presented here focus on both outcomes and processes. The AASM commissioned the Adult OSA Quality Measures Workgroup to develop quality care measures aimed at optimizing care for adult patients with OSA. These quality care measures developed by the Adult OSA Quality Measures Workgroup are an extension of the original Centers for Medicare & Medicaid Services (CMS) approved Physician Quality Reporting System (PQRS) measures group for OSA. The measures are based on the available scientific evidence, focus on public safety, and strive to improve quality of life and cardiovascular outcomes for individual OSA patients. The three outcomes that were selected were as follows: (1) improve disease detection and categorization; (2) improve quality of life; and (3) reduce cardiovascular risk. After selecting these relevant outcomes, a total of ten process measures were chosen that could be applied and assessed for the purpose of accomplishing these outcomes. In the future, the measures described in this document may be reported through the PQRS in addition to, or as a replacement for, the current OSA measures group. The overall objective for the development of these measures is that implementation of these quality measures will result in improved patient outcomes, reduce the public health burden of OSA, and provide a measurable standard for evaluating and managing OSA. Citation: Aurora RN, Collop NA, Jacobowitz O, Thomas SM, Quan SF, Aronsky AJ. 
Quality measures for the care of adult patients with obstructive sleep apnea. J Clin Sleep Med 2015;11(3):357–383. PMID:25700878
NASA Technical Reports Server (NTRS)
Stoughton, R. M.
1990-01-01
A proposed methodology applicable to the design of manipulator systems is described. The current design process is especially weak in the preliminary design phase, since there is no accepted measure to be used in trading off the different options available for the various subsystems. The design process described uses Cartesian End-Effector Impedance as a measure of performance for the system. Having this measure of performance, it is shown how it may be used to determine the trade-offs necessary in the preliminary design phase. The design process involves three main parts: (1) determination of desired system performance in terms of End-Effector Impedance; (2) trading off design options to achieve this desired performance; and (3) verification of system performance through laboratory testing. The design process is developed using numerous examples and experiments to demonstrate the feasibility of this approach to manipulator design.
BIOMONIITORING RESEARCH WITHIN THE U.S. EPA'S OFFICE OF RESEARCH AND DEVELOPMENT
Current ORD exposure research is directed toward developing the processes, tools, and information to put biomonitoring data into perspective for the risk assessment process, to define the appropriate uses of specific biomarkers, and to integrate biomarker measurements with exposu...
Development of critical dimension measurement scanning electron microscope for ULSI (S-8000 series)
NASA Astrophysics Data System (ADS)
Ezumi, Makoto; Otaka, Tadashi; Mori, Hiroyoshi; Todokoro, Hideo; Ose, Yoichi
1996-05-01
The semiconductor industry is moving from half-micron to quarter-micron design rules. To support this evolution, Hitachi has developed a new critical dimension measurement scanning electron microscope (CD-SEM), the model S-8800 series, for quality control of quarter-micron process lines. The new CD-SEM provides detailed examination of process conditions with 5 nm resolution and 5 nm repeatability (3 sigma) at an accelerating voltage of 800 V using secondary electron imaging. In addition, a newly developed load-lock system achieves a high sample throughput of 20 wafers/hour (5 point measurements per wafer) under continuous operation. To support user friendliness, the system incorporates a graphical user interface (GUI), an automated pattern recognition system that helps locate measurement points, both manual and semi-automated operation, and user-programmable operating parameters.
Caffeine expectancy: instrument development in the Rasch measurement framework.
Heinz, Adrienne J; Kassel, Jon D; Smith, Everett V
2009-09-01
Although caffeine is the most widely consumed psychoactive drug in the world, the mechanisms associated with consumption are not well understood. Nonetheless, outcome expectancies for caffeine use are thought to underlie caffeine's reinforcing properties. To date, however, there is no available, sufficient measure by which to assess caffeine expectancy. Therefore, the current study sought to develop such a measure employing Rasch measurement models. Unlike traditional measurement development techniques, Rasch analyses afford dynamic and interactive control of the analysis process and generate helpful information to guide instrument construction. A 5-stage developmental process is described, ultimately yielding a 37-item Caffeine Expectancy Questionnaire (CEQ) comprised of 4 factors representing "withdrawal symptoms," "positive effects," "acute negative effects," and "mood effects." Initial evaluation of the CEQ yielded sufficient evidence for various aspects of validity. Although additional research with more heterogeneous samples is required to further assess the measure's reliability and validity, the CEQ demonstrates potential with regard to its utility in experimental laboratory research and clinical application. 2009 APA, all rights reserved.
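The dichotomous Rasch model underlying such instrument-development analyses reduces to a one-parameter logistic function of the difference between person trait level and item difficulty. A minimal sketch:

```python
import math

def rasch_p(theta: float, b: float) -> float:
    """Dichotomous Rasch model: probability that a person with trait
    level theta endorses an item with difficulty b (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When person trait level equals item difficulty, endorsement
# probability is exactly 0.5 - the anchor point of the logit scale.
p = rasch_p(theta=1.0, b=1.0)
```

Item and person parameters estimated under this model are what let Rasch analyses flag misfitting items interactively during scale construction, as described for the CEQ.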
NASA Astrophysics Data System (ADS)
Kelber, C.; Marke, S.; Trommler, U.; Rupprecht, C.; Weis, S.
2017-03-01
Thermal spraying processes are becoming increasingly important in high-technology areas, such as automotive engineering and medical technology. The method offers the advantage of local layer application with different materials and high deposition rates. Challenges in the application of thermal spraying result from the complex interaction of different influencing variables, which can be attributed to the properties of different materials, the supply of operating media, electrical parameters, flow mechanics, plasma physics and automation. In addition, spraying systems are subject to constant wear. Due to the process specification and the high demands on the produced coatings, innovative quality assurance tools are necessary. A central aspect, which has not yet been considered, is the management of data relating to the available measured variables, in particular from the spraying system, the handling system, work safety devices and additional measuring sensors. Recording all process-characterizing variables, linking and evaluating them, and using the data for active process control all presuppose a novel, innovative control system (hardware and software), which was to be developed within the scope of the research project. In addition, new measurement methods and sensors are to be developed and qualified in order to improve the process reliability of thermal spraying.
Three-dimensional measurement system for crime scene documentation
NASA Astrophysics Data System (ADS)
Adamczyk, Marcin; Hołowko, Elwira; Lech, Krzysztof; Michoński, Jakub; Mączkowski, Grzegorz; Bolewicki, Paweł; Januszkiewicz, Kamil; Sitnik, Robert
2017-10-01
Three-dimensional measurement techniques (such as photogrammetry, time of flight, structure from motion or structured light) are becoming a standard in the crime scene documentation process. The use of 3D measurement techniques provides an opportunity to prepare a more insightful investigation and helps to show every trace in the context of the entire crime scene. In this paper we present a hierarchical, three-dimensional measurement system designed for the crime scene documentation process. Our system reflects the actual standards in crime scene documentation: it performs measurements in two stages. The first stage, the most general, is carried out with a scanner of relatively low spatial resolution but large measuring volume and is used to document the whole scene. The second stage is much more detailed: high resolution with a smaller measuring volume, for areas that require a more detailed approach. The documentation process is supervised by a specialised application, CrimeView3D, a software platform for measurement management (connecting with scanners, carrying out measurements, and automatic or semi-automatic data registration in real time) and data visualisation (3D visualisation of documented scenes). It also provides a series of useful tools for forensic technicians: a virtual measuring tape, searching for sources of blood spatter, a virtual walk through the crime scene, and many others. We also report results of metrological validation of the scanners, performed according to the VDI/VDE standard, and outcomes from measurement sessions conducted on real crime scenes in cooperation with technicians from the Central Forensic Laboratory of the Police.
Ackerman, Sara L; Gourley, Gato; Le, Gem; Williams, Pamela; Yazdany, Jinoos; Sarkar, Urmimala
2018-03-14
The aim of the study was to develop standards for tracking patient safety gaps in ambulatory care in safety net health systems. Leaders from five California safety net health systems were invited to participate in a modified Delphi process sponsored by the Safety Promotion Action Research and Knowledge Network (SPARKNet) and the California Safety Net Institute in 2016. During each of the three Delphi rounds, the feasibility and validity of 13 proposed patient safety measures were discussed and prioritized. Surveys and transcripts from the meetings were analyzed to understand the decision-making process. The Delphi process included eight panelists. Consensus was reached to adopt 9 of the 13 proposed measures. All 9 measures were unanimously considered valid, but concern was expressed about the feasibility of implementing several of them. Although safety net health systems face high barriers to standardized measurement, our study demonstrates that consensus can be reached on acceptable and feasible methods for tracking patient safety gaps in safety net health systems. If accompanied by the active participation of key stakeholder groups, including patients, clinicians, staff, data system professionals, and health system leaders, the consensus measures reported here represent one step toward improving ambulatory patient safety in safety net health systems.
ERIC Educational Resources Information Center
Schoen, Robert C.; Bray, Wendy; Wolfe, Christopher; Tazaz, Amanda M.; Nielsen, Lynne
2017-01-01
This study reports on the development and field study of K-TEEM, a web-based assessment instrument designed to measure mathematical knowledge for teaching (MKT) at the early elementary level. The development process involved alignment with early elementary curriculum standards, expert review of items and scoring criteria, cognitive interviews with…
Development of an Instrument to Measure Medical Students' Attitudes toward People with Disabilities
ERIC Educational Resources Information Center
Symons, Andrew B.; Fish, Reva; McGuigan, Denise; Fox, Jeffery; Akl, Elie A.
2012-01-01
As curricula to improve medical students' attitudes toward people with disabilities are developed, instruments are needed to guide the process and evaluate effectiveness. The authors developed an instrument to measure medical students' attitudes toward people with disabilities. A pilot instrument with 30 items in four sections was administered to…
Developing Image Processing Meta-Algorithms with Data Mining of Multiple Metrics
Cunha, Alexandre; Toga, A. W.; Parker, D. Stott
2014-01-01
People often use multiple metrics in image processing, but here we take a novel approach of mining the values of batteries of metrics on image processing results. We present a case for extending image processing methods to incorporate automated mining of multiple image metric values. Here by a metric we mean any image similarity or distance measure, and in this paper we consider intensity-based and statistical image measures and focus on registration as an image processing problem. We show how it is possible to develop meta-algorithms that evaluate different image processing results with a number of different metrics and mine the results in an automated fashion so as to select the best results. We show that the mining of multiple metrics offers a variety of potential benefits for many image processing problems, including improved robustness and validation. PMID:24653748
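As a concrete illustration of the mining idea, the following sketch scores several candidate registration results against a reference image with two common metrics and selects the candidate with the best aggregate rank. The specific metrics (MSE and normalized cross-correlation) and the rank-sum aggregation are illustrative assumptions, not the battery of metrics used by the authors:

```python
import numpy as np

def mse(a, b):
    # Mean squared error between two images: lower is better.
    return float(np.mean((a - b) ** 2))

def ncc(a, b):
    # Normalized cross-correlation: higher is better (1.0 = identical up to offset/scale).
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def rank_results(reference, candidates):
    """Score each candidate registration result with several metrics,
    rank candidates per metric, and pick the one with the best rank sum."""
    scores = [{"mse": mse(reference, c), "ncc": ncc(reference, c)} for c in candidates]
    ranks = np.zeros(len(candidates))
    # MSE ranked ascending (small = good), NCC ranked descending (large = good).
    for r, i in enumerate(np.argsort([s["mse"] for s in scores])):
        ranks[i] += r
    for r, i in enumerate(np.argsort([-s["ncc"] for s in scores])):
        ranks[i] += r
    return int(np.argmin(ranks)), scores
```

In practice the battery would include many more intensity-based and statistical measures, and the mining step could use more robust aggregation than a plain rank sum.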
Processing methods for photoacoustic Doppler flowmetry with a clinical ultrasound scanner
NASA Astrophysics Data System (ADS)
Bücking, Thore M.; van den Berg, Pim J.; Balabani, Stavroula; Steenbergen, Wiendelt; Beard, Paul C.; Brunker, Joanna
2018-02-01
Photoacoustic flowmetry (PAF) based on time-domain cross correlation of photoacoustic signals is a promising technique for deep tissue measurement of blood flow velocity. Signal processing has previously been developed for single element transducers. Here, the processing methods for acoustic resolution PAF using a clinical ultrasound transducer array are developed and validated using a 64-element transducer array with a -6 dB detection band of 11 to 17 MHz. Measurements were performed on a flow phantom consisting of a tube (580 μm inner diameter) perfused with human blood flowing at physiological speeds ranging from 3 to 25 mm / s. The processing pipeline comprised: image reconstruction, filtering, displacement detection, and masking. High-pass filtering and background subtraction were found to be key preprocessing steps to enable accurate flow velocity estimates, which were calculated using a cross-correlation based method. In addition, the regions of interest in the calculated velocity maps were defined using a masking approach based on the amplitude of the cross-correlation functions. These developments enabled blood flow measurements using a transducer array, bringing PAF one step closer to clinical applicability.
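The displacement-detection step of such a pipeline can be sketched as a cross-correlation peak search between two successive depth profiles; the parameter names `dz` (spatial sampling interval) and `dt` (time between acquisitions) are assumptions, and this omits the reconstruction, filtering, and masking stages described above:

```python
import numpy as np

def displacement_via_xcorr(sig1, sig2, dz):
    """Estimate the spatial shift (in metres) between two depth profiles
    by locating the peak of their cross-correlation.
    dz: spatial sampling interval of the profiles [m]."""
    s1 = sig1 - sig1.mean()
    s2 = sig2 - sig2.mean()
    corr = np.correlate(s2, s1, mode="full")
    # Lag (in samples) by which sig2 is shifted relative to sig1.
    lag = int(np.argmax(corr)) - (len(s1) - 1)
    return lag * dz, corr

def velocity_estimate(sig1, sig2, dz, dt):
    # Velocity = displacement between acquisitions / time between acquisitions.
    shift, _ = displacement_via_xcorr(sig1, sig2, dz)
    return shift / dt
```

A real implementation would interpolate around the correlation peak for sub-sample resolution and mask low-amplitude correlations, as the paper describes.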
In-process fault detection for textile fabric production: onloom imaging
NASA Astrophysics Data System (ADS)
Neumann, Florian; Holtermann, Timm; Schneider, Dorian; Kulczycki, Ashley; Gries, Thomas; Aach, Til
2011-05-01
Constant and traceable high fabric quality is of great importance both for technical textiles and for high-quality conventional fabrics. Usually, quality inspection is carried out by trained personnel, whose detection rate and maximum period of concentration are limited. Low-resolution automated fabric inspection machines using texture analysis have been developed, and since 2003 systems for in-process inspection on weaving machines ("onloom") have been commercially available. With these, defects can be detected but not measured quantitatively with precision. Most systems are also prone to inevitable machine vibrations, and feedback loops for fault prevention are not established. Technology has evolved since 2003: camera and computer prices have dropped, resolutions have been enhanced, and recording speeds have increased. These are the preconditions for real-time processing of high-resolution images, yet so far these technological achievements have not been used in textile fabric production. For efficient use, a measurement system must be integrated into the weaving process, and new algorithms for defect detection and measurement must be developed. The goal of the joint project is the development of a modern machine vision system for nondestructive onloom fabric inspection. The system consists of a vibration-resistant machine integration, a high-resolution machine vision system, and new, reliable, and robust algorithms with a quality database for defect documentation. The system is meant to detect, measure, and classify at least 80% of economically relevant defects. Concepts for feedback loops into the weaving process will also be pointed out.
Interfacing of high temperature Z-meter setup using python
NASA Astrophysics Data System (ADS)
Patel, Ashutosh; Sisodia, Shashank; Pandey, Sudhir K.
2017-05-01
In this work, we interface the high-temperature Z-meter setup to automate the whole measurement process. A program was built in the open-source programming language Python that converts the manual measurement process into a fully automated one without any added cost. Using this program, simultaneous measurements of the Seebeck coefficient (α), thermal conductivity (κ), and electrical resistivity (ρ) are performed, and from all three the figure of merit (ZT) is calculated. The developed program was verified by performing measurements on a p-type Bi0.36Sb1.45Te3 sample, and the data obtained are found to be in good agreement with reported data.
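Assuming the standard definition of the thermoelectric figure of merit, ZT = α²T/(ρκ), which the abstract implies but does not spell out, the final calculation step of such a program might look like this minimal sketch (SI units assumed):

```python
def figure_of_merit(alpha, kappa, rho, T):
    """Dimensionless thermoelectric figure of merit ZT = alpha^2 * T / (rho * kappa).

    alpha: Seebeck coefficient [V/K]
    kappa: thermal conductivity [W/(m K)]
    rho:   electrical resistivity [ohm m]
    T:     absolute temperature [K]
    """
    return alpha ** 2 * T / (rho * kappa)

# Example with plausible values for a bismuth-telluride-type material:
# alpha = 200 uV/K, kappa = 1.5 W/(m K), rho = 1e-5 ohm m, T = 300 K
zt = figure_of_merit(alpha=200e-6, kappa=1.5, rho=1e-5, T=300)
```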
Software Quality Assurance Metrics
NASA Technical Reports Server (NTRS)
McRae, Kalindra A.
2004-01-01
Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and achieving a degree of excellence and refinement in a project or product. Software quality is a set of attributes by which a software product's quality is described and evaluated; the set includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that had been used in other projects but were not currently being used by the SA team, and to report them to the Software Assurance team to see if any could be implemented in their software assurance life cycle process.
Validation of a Communication Process Measure for Coding Control in Counseling.
ERIC Educational Resources Information Center
Heatherington, Laurie
The increasingly popular view of the counseling process from an interactional perspective necessitates the development of new measurement instruments which are suitable to the study of the reciprocal interaction between people. The validity of the Relational Communication Coding System, an instrument which operationalizes the constructs of…
Implementing an ROI Measurement Process at Dell Computer.
ERIC Educational Resources Information Center
Tesoro, Ferdinand
1998-01-01
This return-on-investment (ROI) evaluation study determined the business impact of the sales negotiation training course to Dell Computer Corporation. A five-step ROI measurement process was used: Plan-Develop-Analyze-Communicate-Leverage. The corporate sales information database was used to compare pre- and post-training metrics for both training…
EVALUATION OF A TEST METHOD FOR MEASURING INDOOR AIR EMISSIONS FROM DRY-PROCESS PHOTOCOPIERS
A large chamber test method for measuring indoor air emissions from office equipment was developed, evaluated, and revised based on the initial testing of four dry-process photocopiers. Because all chambers may not necessarily produce similar results (e.g., due to differences in ...
Bass, Kristin M.; Drits-Esser, Dina; Stark, Louisa A.
2016-01-01
The credibility of conclusions made about the effectiveness of educational interventions depends greatly on the quality of the assessments used to measure learning gains. This essay, intended for faculty involved in small-scale projects, courses, or educational research, provides a step-by-step guide to the process of developing, scoring, and validating high-quality content knowledge assessments. We illustrate our discussion with examples from our assessments of high school students’ understanding of concepts in cell biology and epigenetics. Throughout, we emphasize the iterative nature of the development process, the importance of creating instruments aligned to the learning goals of an intervention or curricula, and the importance of collaborating with other content and measurement specialists along the way. PMID:27055776
Property measurements and solidification studies by electrostatic levitation.
Paradis, Paul-François; Yu, Jianding; Ishikawa, Takehiko; Yoda, Shinichi
2004-11-01
The National Space Development Agency of Japan has recently developed several electrostatic levitation furnaces and implemented new techniques and procedures for property measurement, solidification studies, and atomic structure research. In addition to providing a contamination-free environment for undercooled and liquid metals and semiconductors, the newly developed facilities possess the unique capabilities of handling ceramics and high-vapor-pressure materials, reducing processing time, and imaging high-luminosity samples. These are exemplified in this paper with the successful processing of BaTiO3, which allowed measurement of the density of the high-temperature solid, liquid, and undercooled phases. Furthermore, the material resulting from containerless solidification consisted of micrometer-size particles and a glass-like phase exhibiting a giant dielectric constant exceeding 100,000.
NASA Astrophysics Data System (ADS)
Li, Hongkai; Qu, Zilian; Zhao, Qian; Tian, Fangxin; Zhao, Dewen; Meng, Yonggang; Lu, Xinchun
2013-12-01
In recent years, a variety of film thickness measurement techniques for copper chemical mechanical planarization (CMP) have been proposed. In this paper, the eddy-current technique is used. The control system of the CMP tool developed in the State Key Laboratory of Tribology includes an in situ module and an off-line module in its measurement subsystem. The in situ module obtains the thickness of the copper film on the wafer surface in real time and accurately judges when the CMP process should stop; this is called end-point detection. The off-line module is used for multi-point measurement after the CMP process, to determine the thickness of the remaining copper film. The whole control system is structured in two levels, with the physical connection between the upper and lower levels achieved by industrial Ethernet. The process flow includes calibration and measurement, with different algorithms for the two modules. In the software development, C++ was chosen as the programming language, in combination with Qt open source to design the two modules' GUIs and OPC technology to implement communication between the two levels. In addition, a drawing function was developed using Matlab, enriching the software functions of the off-line module. The results show that the control system runs stably after repeated tests and extended practical operation.
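End-point detection of the kind described, stopping the CMP process once the measured copper film reaches a target thickness, can be sketched as a thresholding rule on smoothed thickness readings. The moving-average debouncing and all parameter names here are illustrative assumptions, not the algorithm used in the actual control system:

```python
import collections

def make_endpoint_detector(target_thickness, window=5):
    """Return a callable that is fed successive eddy-current thickness
    readings [nm] and reports True once the moving average of the last
    `window` readings drops to the target thickness. Averaging debounces
    sensor noise so a single low outlier does not stop the process early."""
    buf = collections.deque(maxlen=window)

    def update(thickness_nm):
        buf.append(thickness_nm)
        return len(buf) == window and sum(buf) / window <= target_thickness

    return update
```

In a real tool the decision would also account for pad position, sensor calibration, and the different algorithms of the in situ and off-line modules mentioned above.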
A content review of cognitive process measures used in pain research within adult populations.
Day, M A; Lang, C P; Newton-John, T R O; Ehde, D M; Jensen, M P
2017-01-01
Previous research suggests that measures of cognitive process may be confounded by the inclusion of items that also assess cognitive content. The primary aims of this content review were to: (1) identify the domains of cognitive processes assessed by measures used in pain research; and (2) determine whether pain-specific cognitive process measures with adequate psychometric properties exist. The PsycINFO, CINAHL, PsycARTICLES, MEDLINE, and Academic Search Complete databases were searched to identify measures of cognitive process used in pain research. Identified measures were double coded, and each measure's items were rated as: (1) cognitive content; (2) cognitive process; (3) behavioural/social; and/or (4) emotional coping/responses to pain. A total of 319 scales were identified; of these, 29 were coded as providing an unconfounded assessment of cognitive process, and 12 were pain-specific. The cognitive process domains assessed by these measures are Absorption, Dissociation, Reappraisal, Distraction/Suppression, Acceptance, Rumination, Non-Judgment, and Enhancement. Pain-specific, unconfounded measures were identified for Dissociation, Reappraisal, Distraction/Suppression, and Acceptance. Psychometric properties of all 319 scales are reported in supplementary material. To understand the importance of cognitive processes in influencing pain outcomes, and to explain the efficacy of pain treatments, valid and pain-specific cognitive process measures that are not confounded with non-process domains (e.g., cognitive content) are needed. The findings of this content review suggest that future research focused on developing cognitive process measures is critical to advancing our understanding of the mechanisms that underlie effective pain treatment. Many cognitive process measures used in pain research contain a 'mix' of items assessing cognitive process, cognitive content, and behavioural/emotional responses. This review describes the domains assessed by measures of cognitive processes in pain research, as well as the strengths and limitations of these measures. © 2016 European Pain Federation - EFIC®.
Tombaugh, Tom N; Rees, Laura; Stormer, Peter; Harrison, Allyson G; Smith, Andra
2007-01-01
In spite of the fact that reaction time (RT) measures are sensitive to the effects of traumatic brain injury (TBI), few RT procedures have been developed for use in standard clinical evaluations. The computerized test of information processing (CTIP) [Tombaugh, T. N., & Rees, L. (2000). Manual for the computerized tests of information processing (CTIP). Ottawa, Ont.: Carleton University] was designed to measure the degree to which TBI decreases the speed at which information is processed. The CTIP consists of three computerized programs that progressively increase the amount of information that is processed. Results of the current study demonstrated that RT increased as the difficulty of the CTIP tests increased (known as the complexity effect), and as severity of injury increased (from mild to severe TBI). The current study also demonstrated the importance of selecting a non-biased measure of variability. Overall, findings suggest that the CTIP is an easy to administer and sensitive measure of information processing speed.
2003-06-27
KENNEDY SPACE CENTER, FLA. - At Vandenberg Air Force Base, Calif., the Pegasus launch vehicle is moved toward its hangar. The Pegasus will carry the SciSat-1 spacecraft in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The scientific mission of SciSat-1 is to measure and understand the chemical processes that control the distribution of ozone in the Earth’s atmosphere, particularly at high altitudes. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
2003-06-27
KENNEDY SPACE CENTER, FLA. - The Pegasus launch vehicle is moved back to its hangar at Vandenberg Air Force Base, Calif. The Pegasus will carry the SciSat-1 spacecraft in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The scientific mission of SciSat-1 is to measure and understand the chemical processes that control the distribution of ozone in the Earth’s atmosphere, particularly at high altitudes. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
2003-06-26
KENNEDY SPACE CENTER, FLA. - The SciSat-1 spacecraft is uncrated at Vandenberg Air Force Base, Calif. SciSat-1 weighs approximately 330 pounds and will be placed in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The scientific mission of SciSat-1 is to measure and understand the chemical processes that control the distribution of ozone in the Earth’s atmosphere, particularly at high altitudes. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
2003-06-26
KENNEDY SPACE CENTER, FLA. - The SciSat-1 spacecraft is revealed after being uncrated at Vandenberg Air Force Base, Calif. SciSat-1 weighs approximately 330 pounds and will be placed in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The scientific mission of SciSat-1 is to measure and understand the chemical processes that control the distribution of ozone in the Earth’s atmosphere, particularly at high altitudes. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
2003-06-26
KENNEDY SPACE CENTER, FLA. - Workers at Vandenberg Air Force Base, Calif., prepare to move the SciSat-1 spacecraft. SciSat-1 weighs approximately 330 pounds and will be placed in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The scientific mission of SciSat-1 is to measure and understand the chemical processes that control the distribution of ozone in the Earth’s atmosphere, particularly at high altitudes. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
2003-06-27
KENNEDY SPACE CENTER, FLA. - At Vandenberg Air Force Base, Calif., the Pegasus launch vehicle is moved into its hangar. The Pegasus will carry the SciSat-1 spacecraft in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The scientific mission of SciSat-1 is to measure and understand the chemical processes that control the distribution of ozone in the Earth’s atmosphere, particularly at high altitudes. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
Viscosity Measurement Technique for Metal Fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ban, Heng; Kennedy, Rory
2015-02-09
Metallic fuels have exceptional transient behavior, excellent thermal conductivity, and a more straightforward reprocessing path, which does not separate out pure plutonium from the process stream. Fabrication of fuel containing minor actinides and rare earth (RE) elements for irradiation tests, for instance U-20Pu-3Am-2Np-1.0RE-15Zr samples at the Idaho National Laboratory, is generally done by melt casting in an inert atmosphere. For the design of a casting system and further scale-up development, computational modeling of the casting process is needed to provide information on melt flow and solidification for process optimization. There is therefore a need for melt viscosity data, viscosity being the melt property that most strongly controls the melt flow. The goal of the project was to develop a measurement technique that uses a fully sealed melt sample, with no americium vapor loss, to determine the viscosity of metallic melts at temperatures relevant to the casting process. The specific objectives of the project were to: develop mathematical models to establish the principle of the measurement method, design and build a viscosity measurement prototype system based on the established principle, and calibrate the system and quantify the uncertainty range. The results of the project indicate that the oscillating cup technique is applicable to melt viscosity measurement. Detailed mathematical models of innovative sample ampoule designs were developed to determine not only melt viscosity but also, for certain designs, melt density. Measurement uncertainties were analyzed and quantified. The results of this project can be used as the initial step toward the eventual goal of establishing a viscosity measurement system for radioactive melts.
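In oscillating-cup viscometry, the quantity fed into the working equations is typically the logarithmic decrement of the cup's damped torsional oscillation. The sketch below extracts the decrement from successive same-sign peak amplitudes; the working equation that relates the decrement and period to melt viscosity (e.g., Roscoe's) is not reproduced here, and the function name is an assumption:

```python
import math

def logarithmic_decrement(peaks):
    """Logarithmic decrement of a damped oscillation from a sequence of
    successive same-sign peak amplitudes. For exponential damping each
    peak ratio is constant, so averaging the log-ratios reduces noise."""
    if len(peaks) < 2:
        raise ValueError("need at least two peak amplitudes")
    ratios = [peaks[i] / peaks[i + 1] for i in range(len(peaks) - 1)]
    return sum(math.log(r) for r in ratios) / len(ratios)
```

In a sealed-ampoule instrument, the decrement (and oscillation period) measured with and without the melt sample would then be inserted into the viscometer's working equation to obtain viscosity.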
Measuring Perceptions of Engagement in Teamwork in Youth Development Programs
ERIC Educational Resources Information Center
Cater, Melissa; Jones, Kimberly Y.
2014-01-01
The literature regarding teamwork has supported the idea that the key to improving team performance is to understand team processes. Early work within the realm of teamwork focused on quantifiable measures of team performance, like number of products developed. The measure of a successful team hinged on whether or not the team accomplished the end…
Candidate Quality Measures for Hand Surgery.
2017-11-01
Quality measures are tools used by physicians, health care systems, and payers to evaluate performance, monitor the outcomes of interventions, and inform quality improvement efforts. Few quality measures exist that address hand surgery care. We completed a RAND/UCLA (University of California Los Angeles) Delphi Appropriateness process with the goal of developing and evaluating candidate hand surgery quality measures to be used in national quality measure development efforts. A consortium of 9 academic upper limb surgeons completed the process to evaluate the importance, scientific acceptability, usability, and feasibility of 44 candidate quality measures, which addressed hand problems the panelists felt were most appropriate for quality measure development. Panelists rated the measures on an ordinal scale between 1 (definitely not valid) and 9 (definitely valid) in 2 rounds (a preliminary round and a final round) with an intervening face-to-face discussion. Ratings from 1 to 3 were considered not valid, 4 to 6 equivocal or uncertain, and 7 to 9 valid. If no more than 2 of the 9 ratings were outside the 3-point range that included the median (1-3, 4-6, or 7-9), the panelists were considered to be in agreement. If 3 or more of the panelists' ratings of a measure were within the 1 to 3 range and 3 or more ratings were in the 7 to 9 range, the panelists were considered to be in disagreement. There was agreement on 43% (19) of the measures as important, 27% (12) as scientifically sound, 48% (21) as usable, and 59% (26) as feasible to complete. Ten measures met all 4 of these criteria and were therefore considered valid measurements of quality. The quality measures developed address outcomes (patient-reported outcomes for assessment and improvement of function) and processes of care (utilization rates of imaging, antibiotics, occupational therapy, ultrasound, and operative treatment).
The consortium developed 10 measures of hand surgery quality using a validated methodology. These measures merit further development. Quality measures can be used to evaluate the quality of care provided by physicians and health systems and can inform quality and value-based reimbursement models. Copyright © 2017 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
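The agreement/disagreement rule described in this abstract is mechanical enough to express in code. This sketch follows the stated definitions (3-point bands, agreement when no more than 2 of the 9 ratings fall outside the band containing the median, disagreement when 3 or more ratings fall in each extreme band); the function and label names are assumptions:

```python
import statistics

def band(rating):
    # Map a 1-9 rating into its 3-point band: 'low' (1-3), 'mid' (4-6), 'high' (7-9).
    return "low" if rating <= 3 else "mid" if rating <= 6 else "high"

def classify(ratings):
    """Classify a set of panel ratings per the RAND/UCLA-style rule above.

    Disagreement: >=3 ratings in 1-3 AND >=3 ratings in 7-9.
    Agreement: no more than 2 ratings outside the median's 3-point band;
    a measure is 'valid' when agreement falls in the 7-9 band."""
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    if low >= 3 and high >= 3:
        return "disagreement"
    med_band = band(statistics.median(ratings))
    outside = sum(1 for r in ratings if band(r) != med_band)
    if outside <= 2:
        if med_band == "high":
            return "agreement-valid"
        if med_band == "low":
            return "agreement-not-valid"
        return "agreement-uncertain"
    return "no-agreement"
```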
2013-01-01
Background The use of teams is a well-known approach in a variety of settings, including health care, in both developed and developing countries. Team performance comprises teamwork and taskwork, and ascertaining whether a team is performing as expected to achieve the desired outcome has rarely been done in health care settings in resource-limited countries. Measuring teamwork requires identifying the dimensions or processes that make up the teamwork construct, while measuring taskwork requires identifying specific team functions. Since 2008, a community-based project in rural Zambia has teamed community health workers (CHWs) and traditional birth attendants (TBAs), supported by Neighborhood Health Committees (NHCs), to provide essential newborn care and continuous curative care for children 0–59 months. This paper describes the process of developing a measure of teamwork and taskwork for community-based health teams in rural Zambia. Methods Six group discussions and pile-sorting sessions were conducted with three NHCs and three groups of CHW-TBA teams. Each session comprised six individuals. Results We selected 17 factors identified by participants as relevant for measuring teamwork in this rural setting. Participants endorsed seven functions as important for measuring taskwork. To explain team performance, we assigned 20 factors to three sub-groups: personal, community-related, and service-related. Conclusion Community- and culturally relevant processes, functions, and factors were used to develop a tool for measuring teamwork and taskwork in this rural community, and the tool is quite distinct from tools used in developed countries. PMID:23802766
Methods of measurement for semiconductor materials, process control, and devices
NASA Technical Reports Server (NTRS)
Bullis, W. M. (Editor)
1971-01-01
The development of methods of measurement for semiconductor materials, process control, and devices is discussed. The following subjects are also presented: (1) demonstration of the high sensitivity of the infrared response technique by the identification of gold in a germanium diode, (2) verification that transient thermal response is significantly more sensitive to the presence of voids in die attachment than steady-state thermal resistance, and (3) development of equipment for determining susceptibility of transistors to hot spot formation by the current-gain technique.
Developing a Science Process Skills Test for Secondary Students: Validity and Reliability Study
ERIC Educational Resources Information Center
Feyzioglu, Burak; Demirdag, Baris; Akyildiz, Murat; Altun, Eralp
2012-01-01
Science process skills are claimed to enable individuals to improve their own life visions and to provide a scientific view/literacy as a standard of their understanding about the nature of science. The main purpose of this study was to develop a valid, reliable and practical test for measuring Science Process Skills (SPS) in secondary…
ERIC Educational Resources Information Center
Kentucky State Dept. of Libraries, Frankfort.
This document is the beginning of a process. The objectives of the process are to improve decisions between alternate choices in the development of statewide library services. Secondary functions are to develop the tools for providing information relevant to decisions, to measure and monitor services, and to aid in the communication process. The…
Medendorp, Joseph; Bric, John; Connelly, Greg; Tolton, Kelly; Warman, Martin
2015-08-10
The purpose of this manuscript is to present the intended use and long-term maintenance strategy of an online laser diffraction particle size method used for process control in a spray drying process. A Malvern Insitec was used for online particle size measurements and a Malvern Mastersizer was used for offline particle size measurements. The two methods were developed in parallel with the Mastersizer serving as the reference method. Despite extensive method development across a range of particle sizes, the two instruments demonstrated different sensitivities to material and process changes over the product lifecycle. This paper will describe the procedure used to ensure consistent alignment of the two methods, thus allowing for continued use of online real-time laser diffraction as a surrogate for the offline system over the product lifecycle. Copyright © 2015 Elsevier B.V. All rights reserved.
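The abstract does not describe the alignment procedure itself; as a purely hypothetical illustration, a linear correction mapping the online readings onto the offline reference could be fit by least squares (the function name, the linearity assumption, and the example D50 values are mine, not the authors'):

```python
import numpy as np

def align_methods(online_d50, offline_d50):
    """Fit a linear correction mapping online (Insitec) particle-size readings
    onto the offline (Mastersizer) reference: offline ~= a * online + b.
    Illustrative sketch only; the paper's actual alignment procedure is not
    given in the abstract."""
    a, b = np.polyfit(online_d50, offline_d50, 1)
    return a, b

# Hypothetical paired D50 readings (micrometers) from the two instruments
a, b = align_methods(np.array([50.0, 80.0, 120.0]),
                     np.array([100.5, 160.5, 240.5]))
```

In practice such a mapping would be re-verified whenever material or process changes shift the sensitivity of either instrument, which is the maintenance concern the abstract raises.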
Process auditing in long term care facilities.
Hewitt, S M; LeSage, J; Roberts, K L; Ellor, J R
1985-01-01
The ECC tool development and audit experiences indicated that there is promise in developing a process audit tool to monitor quality of care in nursing homes; moreover, the tool selected required only one hour per resident. Focusing on the care process and resident needs provided useful information for care providers at the unit level as well as for administrative personnel. Besides incorporating a more interdisciplinary focus, the revised tool needs to define the support services most appropriate for nursing homes, include items related to discharge planning, and increase measurement of significant others' involvement in the care process. Future emphasis at the ECC will focus on developing intervention plans to maintain strengths and correct deficiencies identified in the audits. Various strategies to bring about desired changes in the quality of care will be evaluated through regular, periodic monitoring. Having a valid and reliable measure of quality of care as a tool will be an important step forward for LTC facilities.
Hierarchical, Three-Dimensional Measurement System for Crime Scene Scanning.
Marcin, Adamczyk; Maciej, Sieniło; Robert, Sitnik; Adam, Woźniak
2017-07-01
We present a new generation of three-dimensional (3D) measuring systems, developed for the process of crime scene documentation. This measuring system facilitates the preparation of more insightful, complete, and objective documentation for crime scenes. Our system reflects the actual requirements for hierarchical documentation, and it consists of three independent 3D scanners: a laser scanner for overall measurements, a situational structured light scanner for finer measurements, and a detailed structured light scanner for the most detailed parts of the scene. Each scanner has its own spatial resolution, of 2.0, 0.3, and 0.05 mm, respectively. The results of interviews we have conducted with technicians indicate that our developed 3D measuring system has significant potential to become a useful tool for forensic technicians. To ensure the maximum compatibility of our measuring system with the standards that regulate the documentation process, we have also performed a metrological validation and designated the maximum permissible length measurement error E_MPE for each structured light scanner. In this study, we present additional results regarding documentation processes conducted during crime scene inspections and a training session. © 2017 American Academy of Forensic Sciences.
An Imaging System for Automated Characteristic Length Measurement of Debrisat Fragments
NASA Technical Reports Server (NTRS)
Moraguez, Mathew; Patankar, Kunal; Fitz-Coy, Norman; Liou, J.-C.; Sorge, Marlon; Cowardin, Heather; Opiela, John; Krisko, Paula H.
2015-01-01
The debris fragments generated by DebriSat's hypervelocity impact test are currently being processed and characterized through an effort of NASA and USAF. The debris characteristics will be used to update satellite breakup models. In particular, the physical dimensions of the debris fragments must be measured to provide characteristic lengths for use in these models. Calipers and commercial 3D scanners were considered as measurement options, but an automated imaging system was ultimately developed to measure debris fragments. By automating the entire process, the measurement results are made repeatable and the human factor associated with calipers and 3D scanning is eliminated. Unlike using calipers to measure, the imaging system obtains non-contact measurements to avoid damaging delicate fragments. Furthermore, this fully automated measurement system minimizes fragment handling, which reduces the potential for fragment damage during the characterization process. In addition, the imaging system reduces the time required to determine the characteristic length of the debris fragment. In this way, the imaging system can measure the tens of thousands of DebriSat fragments at a rate of about six minutes per fragment, compared to hours per fragment in NASA's current 3D scanning measurement approach. The imaging system utilizes a space carving algorithm to generate a 3D point cloud of the article being measured and a custom developed algorithm then extracts the characteristic length from the point cloud. This paper describes the measurement process, results, challenges, and future work of the imaging system used for automated characteristic length measurement of DebriSat fragments.
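In NASA's satellite breakup models, a fragment's characteristic length is conventionally the mean of three mutually orthogonal maximum extents. A minimal sketch of that geometric step applied to a 3D point cloud (assuming that definition; the project's actual extraction algorithm is custom and not given in the abstract):

```python
import numpy as np

def characteristic_length(points):
    """Mean of three mutually orthogonal maximum extents of a point cloud.
    Brute-force pairwise distances: fine for small or downsampled clouds."""
    pts = np.asarray(points, dtype=float)
    # X: largest point-to-point distance, defining the first axis u
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(d), d.shape)
    X = d[i, j]
    u = (pts[j] - pts[i]) / X if X > 0 else np.array([1.0, 0.0, 0.0])
    # Y: largest extent within the plane orthogonal to u
    proj = pts - np.outer(pts @ u, u)
    d2 = np.linalg.norm(proj[:, None, :] - proj[None, :, :], axis=-1)
    k, l = np.unravel_index(np.argmax(d2), d2.shape)
    Y = d2[k, l]
    if Y > 0:
        v = (proj[l] - proj[k]) / Y
    else:  # degenerate (collinear) cloud: pick any axis orthogonal to u
        v = np.array([0.0, 1.0, 0.0]) if abs(u[1]) < 0.9 else np.array([1.0, 0.0, 0.0])
        v = v - (v @ u) * u
        v /= np.linalg.norm(v)
    # Z: extent along the axis orthogonal to both u and v
    s = pts @ np.cross(u, v)
    Z = s.max() - s.min()
    return (X + Y + Z) / 3.0
```

The system described in the abstract derives such a cloud by space carving from the camera images before this kind of extraction step.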
Timmings, Caitlyn; Khan, Sobia; Moore, Julia E; Marquez, Christine; Pyka, Kasha; Straus, Sharon E
2016-02-24
To address challenges related to selecting a valid, reliable, and appropriate readiness assessment measure in practice, we developed an online decision support tool to aid frontline implementers in healthcare settings in this process. The focus of this paper is to describe a multi-step, end-user driven approach to developing this tool for use during the planning stages of implementation. A multi-phase, end-user driven approach was used to develop and test the usability of a readiness decision support tool. First, readiness assessment measures that are valid, reliable, and appropriate for healthcare settings were identified from a systematic review. Second, a mapping exercise was performed to categorize individual items of included measures according to key readiness constructs from an existing framework. Third, a modified Delphi process was used to collect stakeholder ratings of the included measures on domains of feasibility, relevance, and likelihood to recommend. Fourth, two versions of a decision support tool prototype were developed and evaluated for usability. Nine valid and reliable readiness assessment measures were included in the decision support tool. The mapping exercise revealed that of the nine measures, most measures (78%) focused on assessing readiness for change at the organizational versus the individual level, and that four measures (44%) represented all constructs of organizational readiness. During the modified Delphi process, stakeholders rated most measures as feasible and relevant for use in practice, and reported that they would be likely to recommend use of most measures. Using data from the mapping exercise and stakeholder panel, an algorithm was developed to link users to a measure based on characteristics of their organizational setting and their readiness for change assessment priorities. Usability testing yielded recommendations that were used to refine the Ready, Set, Change! decision support tool.
The Ready, Set, Change! decision support tool is an implementation support designed to facilitate the routine incorporation of a readiness assessment as an early step in implementation. Use of this tool in practice may offer time- and resource-saving implications for implementation.
EDITORIAL: Industrial Process Tomography
NASA Astrophysics Data System (ADS)
Anton Johansen, Geir; Wang, Mi
2008-09-01
There has been tremendous development within measurement science and technology over the past couple of decades. New sensor technologies and compact versatile signal recovery electronics are continuously expanding the limits of what can be measured and the accuracy with which this can be done. Miniaturization of sensors and the use of nanotechnology push these limits further. Also, thanks to powerful and cost-effective computer systems, sophisticated measurement and reconstruction algorithms previously only accessible in advanced laboratories are now available for in situ online measurement systems. The process industries increasingly require more process-related information, motivated by key issues such as improved process control, process utilization and process yields, ultimately driven by cost-effectiveness, quality assurance, environmental and safety demands. Industrial process tomography methods have taken advantage of the general progress in measurement science, and aim at providing more information, both quantitatively and qualitatively, on multiphase systems and their dynamics. The typical approach for such systems has been to carry out one local or bulk measurement and assume that this is representative of the whole system. In some cases, this is sufficient. However, there are many complex systems where the component distribution varies continuously and often unpredictably in space and time. The foundation of industrial tomography is to conduct several measurements around the periphery of a multiphase process, and use these measurements to unravel the cross-sectional distribution of the process components in time and space. This information is used in the design and optimization of industrial processes and process equipment, and also to improve the accuracy of multiphase system measurements in general. 
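The principle described above, combining several measurements taken around the periphery to recover a cross-sectional distribution, can be illustrated with a toy algebraic reconstruction (a 2x2-pixel image and five ray sums; purely didactic, not any particular tomography modality):

```python
import numpy as np

# Each row of A sums the pixels crossed by one "ray" through a 2x2 image,
# flattened as [p00, p01, p10, p11]: two row sums, two column sums, one diagonal.
A = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 0, 1]], dtype=float)
x_true = np.array([1.0, 2.0, 3.0, 4.0])        # unknown component distribution
b = A @ x_true                                  # the peripheral measurements
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)   # least-squares reconstruction
```

Without the diagonal ray the system is rank-deficient and the image is not uniquely determined; real industrial tomography systems use many more rays and regularized reconstruction algorithms for the same reason.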
In this issue we are proud to present a selection of the 145 papers presented at the 5th World Congress on Industrial Process Tomography in Bergen, September 2007. Interestingly, x-ray technologies, one of the first imaging modalities available, keep on moving the limits on both spatial and temporal measurement resolution; experimental results of less than 100 nm and several thousand frames/s are reported, respectively. Important progress is demonstrated in research and development on sensor technologies and algorithms for data processing and image reconstruction, including unconventional sensor design and adaptation of the sensors to the application in question. The number of applications to which tomographic methods are applied is steadily increasing, and results obtained in a representative selection of applications are included. As guest editors we would like to express our appreciation and thanks to all authors who have contributed and to IOP staff for excellent collaboration in the process of finalizing this special feature.
NASA Technical Reports Server (NTRS)
Loos, Alfred C.; Macrae, John D.; Hammond, Vincent H.; Kranbuehl, David E.; Hart, Sean M.; Hasko, Gregory H.; Markus, Alan M.
1993-01-01
A two-dimensional model of the resin transfer molding (RTM) process was developed which can be used to simulate the infiltration of resin into an anisotropic fibrous preform. Frequency dependent electromagnetic sensing (FDEMS) has been developed for in situ monitoring of the RTM process. Flow visualization tests were performed to obtain data which can be used to verify the sensor measurements and the model predictions. Results of the tests showed that FDEMS can accurately detect the position of the resin flow-front during mold filling, and that the model predicted flow-front patterns agreed well with the measured flow-front patterns.
Development and Assessment of a Medication Safety Measurement Program in a Long-Term Care Pharmacy.
Hertig, John B; Hultgren, Kyle E; Parks, Scott; Rondinelli, Rick
2016-02-01
Medication errors continue to be a major issue in the health care system, including in long-term care facilities. While many hospitals and health systems have developed methods to identify, track, and prevent these errors, long-term care facilities historically have not invested in these error-prevention strategies. The objective of this study was two-fold: 1) to develop a set of medication-safety process measures for dispensing in a long-term care pharmacy, and 2) to analyze the data from those measures to determine the relative safety of the process. The study was conducted at In Touch Pharmaceuticals in Valparaiso, Indiana. To assess the safety of the medication-use system, each step was documented using a comprehensive flowchart (process flow map) tool. Once completed and validated, the flowchart was used to complete a "failure modes and effects analysis" (FMEA) identifying ways a process may fail. Operational gaps found during the FMEA were used to identify points of measurement. The research identified a set of eight measures as potential areas of failure; data were then collected on each one of these. More than 133,000 medication doses (opportunities for errors) were included in the study during the research time frame (April 1 to June 4, 2014). Overall, there was an approximate order-entry error rate of 15.26%, with intravenous errors at 0.37%. A total of 21 errors migrated through the entire medication-use system. These 21 errors in 133,000 opportunities resulted in a final check error rate of 0.015%. A comprehensive medication-safety measurement program was designed and assessed. This study demonstrated the ability to detect medication errors in a long-term pharmacy setting, thereby making process improvements measurable. Future, larger, multi-site studies should be completed to test this measurement program.
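The reported rates are simple proportions of detected errors to dose opportunities; a quick check of the final-check figure:

```python
def error_rate_pct(errors, opportunities):
    """Error rate expressed as a percentage of dose opportunities."""
    return 100.0 * errors / opportunities

# 21 errors that migrated through the full medication-use system,
# out of roughly 133,000 dose opportunities
final_check = error_rate_pct(21, 133_000)  # ~0.0158%, consistent with the reported 0.015%
```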
Development of the Clinical Teaching Effectiveness Questionnaire in the United States.
Wormley, Michelle E; Romney, Wendy; Greer, Anna E
2017-01-01
The purpose of this study was to develop a valid measure for assessing clinical teaching effectiveness within the field of physical therapy. The Clinical Teaching Effectiveness Questionnaire (CTEQ) was developed via a 4-stage process, including (1) initial content development, (2) content analysis with 8 clinical instructors with over 5 years of clinical teaching experience, (3) pilot testing with 205 clinical instructors from 2 universities in the Northeast of the United States, and (4) psychometric evaluation, including principal component analysis. The scale development process resulted in a 30-item questionnaire with 4 sections that relate to clinical teaching: learning experiences, learning environment, communication, and evaluation. The CTEQ provides a preliminary valid measure for assessing clinical teaching effectiveness in physical therapy practice.
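The psychometric evaluation in stage 4 rests on principal component analysis; a minimal SVD-based sketch of that step on centered item responses (illustrative only, not the authors' exact analysis):

```python
import numpy as np

def pca_components(responses, k=4):
    """Principal component analysis of questionnaire responses (items in
    columns). Returns the top-k component loadings and the fraction of
    variance each explains. Sketch of the CTEQ analysis stage, under the
    assumption of a standard centered PCA."""
    X = np.asarray(responses, dtype=float)
    X = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    var = S**2 / np.sum(S**2)
    return Vt[:k], var[:k]
```

With real CTEQ data, one would retain components by eigenvalue and explained-variance criteria and interpret the loadings against the four questionnaire sections.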
NASA Technical Reports Server (NTRS)
Otugen, M. Volkan
1997-01-01
Non-intrusive techniques for the dynamic measurement of gas flow properties such as density, temperature and velocity, are needed in the research leading to the development of new generation high-speed aircraft. Accurate velocity, temperature and density data obtained in ground testing and in-flight measurements can help understand the flow physics leading to transition and turbulence in supersonic, high-altitude flight. Such non-intrusive measurement techniques can also be used to study combustion processes of hydrocarbon fuels in aircraft engines. Reliable, time and space resolved temperature measurements in various combustor configurations can lead to a better understanding of high temperature chemical reaction dynamics thus leading to improved modeling and better prediction of such flows. In view of this, a research program was initiated at Polytechnic University's Aerodynamics Laboratory with support from NASA Lewis Research Center through grants NAG3-1301 and NAG3-1690. The overall objective of this program has been to develop laser-based, non-contact, space- and time-resolved temperature and velocity measurement techniques. In the initial phase of the program an Nd:YAG laser-based dual-line Rayleigh scattering technique was developed and tested for the accurate measurement of gas temperature in the presence of background laser glare. Effort was next directed towards the development of a filtered, spectrally-resolved Rayleigh/Mie scattering technique with the objective of developing an interferometric method for time-frozen velocity measurements in high-speed flows utilizing the UV line of an Nd:YAG laser and an appropriate molecular absorption filter. This effort included both a search for an appropriate filter material for the 266 nm laser line and the development and testing of several image processing techniques for the fast processing of Fabry-Perot images for velocity and temperature information. 
Finally, work was also carried out to develop a new laser-based technique for the time-resolved measurement of vorticity and strain rates in turbulent flows.
The Use of Online Surveys to Measure Satisfaction in Job Training and Workforce Development
ERIC Educational Resources Information Center
Schmidt, Steve; Strachota, Elaine; Conceicao, Simone
2006-01-01
This paper examines two empirical studies that used online surveys to collect data to measure satisfaction in job training and workforce development. A description of each study, findings related to response rate, the processes used in online survey development and implementation, as well as recommendations for the future use of online surveys…
ERIC Educational Resources Information Center
Schoen, Robert C.; Bray, Wendy; Wolfe, Christopher; Tazaz, Amanda M.; Nielsen, Lynne
2017-01-01
This study reports on the development and field study of K-TEEM, a web-based assessment instrument designed to measure mathematical knowledge for teaching (MKT) at the early elementary level. The development process involved alignment with early elementary curriculum standards, expert review of items and scoring criteria, cognitive interviews with…
Dasgupta, Annwesa P.; Anderson, Trevor R.; Pelaez, Nancy J.
2016-01-01
Researchers, instructors, and funding bodies in biology education are unanimous about the importance of developing students’ competence in experimental design. Despite this, only limited measures are available for assessing such competence development, especially in the areas of molecular and cellular biology. Also, existing assessments do not measure how well students use standard symbolism to visualize biological experiments. We propose an assessment-design process that 1) provides background knowledge and questions for developers of new “experimentation assessments,” 2) elicits practices of representing experiments with conventional symbol systems, 3) determines how well the assessment reveals expert knowledge, and 4) determines how well the instrument exposes student knowledge and difficulties. To illustrate this process, we developed the Neuron Assessment and coded responses from a scientist and four undergraduate students using the Rubric for Experimental Design and the Concept-Reasoning Mode of representation (CRM) model. Some students demonstrated sound knowledge of concepts and representations. Other students demonstrated difficulty with depicting treatment and control group data or variability in experimental outcomes. Our process, which incorporates an authentic research situation that discriminates levels of visualization and experimentation abilities, shows potential for informing assessment design in other disciplines. PMID:27146159
NASA Technical Reports Server (NTRS)
Watson, Michael D.; Kelley, Gary W.
2012-01-01
The Department of Defense (DoD) defined System Operational Effectiveness (SOE) model provides an exceptional framework for an affordable approach to the development and operation of space launch vehicles and their supporting infrastructure. The SOE model provides a focal point from which to direct and measure technical effectiveness and process efficiencies of space launch vehicles. The application of the SOE model to a space launch vehicle's development and operation effort leads to very specific approaches and measures that require consideration during the design phase. This paper provides a mapping of the SOE model to the development of space launch vehicles for human exploration by addressing the SOE model key points of measurement including System Performance, System Availability, Technical Effectiveness, Process Efficiency, System Effectiveness, Life Cycle Cost, and Affordable Operational Effectiveness. In addition, the application of the SOE model to the launch vehicle development process is defined providing the unique aspects of space launch vehicle production and operations in lieu of the traditional broader SOE context that examines large quantities of fielded systems. The tailoring and application of the SOE model to space launch vehicles provides some key insights into the operational design drivers, capability phasing, and operational support systems.
Black, Stephanie Winkeljohn; Pössel, Patrick
2013-08-01
Adolescents who develop depression have worse interpersonal and affective experiences and are more likely to develop substance problems and/or suicidal ideation compared to adolescents who do not develop depression. This study examined the combined effects of negative self-referent information processing and rumination (i.e., brooding and reflection) on adolescent depressive symptoms. It was hypothesized that the interaction of negative self-referent information processing and brooding would significantly predict depressive symptoms, while the interaction of negative self-referent information processing and reflection would not predict depressive symptoms. Adolescents (n = 92; 13-15 years; 34.7% female) participated in a 6-month longitudinal study. Self-report instruments measured depressive symptoms and rumination; a cognitive task measured information processing. Data were analyzed with path modeling in Amos 19.0. The interaction of negative information processing and brooding significantly predicted an increase in depressive symptoms 6 months later. The interaction of negative information processing and reflection did not significantly predict depression; however, the model did not meet a priori standards for accepting the null hypothesis. Results suggest clinicians working with adolescents at risk for depression should consider focusing on the reduction of brooding and negative information processing to reduce long-term depressive symptoms.
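The moderation hypothesis above can be sketched as an ordinary regression with a product term (a simplified stand-in for the Amos path model; the variable names are mine):

```python
import numpy as np

def fit_interaction(neg, brood, dep):
    """OLS fit of dep ~ neg + brood + neg*brood.
    Illustrative of a moderation test, not the study's path model.
    Returns [intercept, b_neg, b_brood, b_interaction]."""
    X = np.column_stack([np.ones_like(neg), neg, brood, neg * brood])
    coef, *_ = np.linalg.lstsq(X, dep, rcond=None)
    return coef
```

A reliably nonzero coefficient on the product column corresponds to brooding moderating the effect of negative self-referent information processing, which is the pattern the study reports for brooding but not reflection.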
NASA Astrophysics Data System (ADS)
Yuksel, Onur; Baran, Ismet; Ersoy, Nuri; Akkerman, Remko
2018-05-01
Process-induced stresses inherently exist in fiber reinforced polymer composites, particularly in thick parts, due to the presence of non-uniform cure, shrinkage and thermal expansion/contraction during manufacturing. In order to increase the reliability and the performance of the composite materials, process models are developed to predict the residual stress formation. The accuracy of the process models is dependent on the geometrical (micro to macro), material and process parameters as well as the numerical implementation. Therefore, in order to have a reliable process modelling framework, there is a need for validation and, if necessary, calibration of the developed models. This study focuses on measurement of the transverse residual stresses in a relatively thick pultruded profile (20×20 mm) made of glass/polyester. Process-induced residual stresses in the middle of the profile are examined with different techniques which have not previously been applied to transverse residual stresses in thick unidirectional composites. A hole drilling method with strain gage and digital image correlation measurements is employed. Strain values obtained from these measurements are used in a finite element model (FEM) to simulate the hole drilling process and predict the residual stress level. The measured released strain is found to be approximately 180 μm/m from the strain gage. The tensile residual stress at the core of the profile is estimated at approximately 7-10 MPa. The proposed methods and measured values in this study will enable validation and calibration of the process models based on the residual stresses.
The Valued Living Questionnaire: Defining and Measuring Valued Action within a Behavioral Framework
ERIC Educational Resources Information Center
Wilson, Kelly G.; Sandoz, Emily K.; Kitchens, Jennifer; Roberts, Miguel
2010-01-01
A number of cognitive-behavior therapies now strongly emphasize particular behavioral processes as mediators of clinical change specific to that therapy. This shift in emphasis calls for the development of measures sensitive to changes in the therapies' processes. Among these is acceptance and commitment therapy (ACT), which posits valued living…
ERIC Educational Resources Information Center
Visu-Petra, Laura; Miclea, Mircea; Cheie, Lavinia; Benga, Oana
2009-01-01
In self-paced auditory memory span tasks, the microanalysis of response timing measures represents a developmentally sensitive measure, providing insights into the development of distinct processing rates during recall performance. The current study first examined the effects of age and trait anxiety on span accuracy (effectiveness) and response…
Measurement of Learning Process by Semantic Annotation Technique on Bloom's Taxonomy Vocabulary
ERIC Educational Resources Information Center
Yanchinda, Jirawit; Yodmongkol, Pitipong; Chakpitak, Nopasit
2016-01-01
Most rural people, whose highest level of education is elementary school, are often unsuccessfully provided with the science and technology knowledge appropriate for rural sustainable development. This study provides a measurement of the learning process based on Bloom's Taxonomy…
3D interconnect metrology in CMS/ITRI
NASA Astrophysics Data System (ADS)
Ku, Y. S.; Shyu, D. M.; Hsu, W. T.; Chang, P. Y.; Chen, Y. C.; Pang, H. L.
2011-05-01
Semiconductor device packaging technology is rapidly advancing, in response to the demand for thinner and smaller electronic devices. Three-dimensional chip/wafer stacking that uses through-silicon vias (TSV) is a key technical focus area, and the continuous development of this novel technology has created a need for non-contact characterization. Many of these challenges are novel to the industry due to the relatively large variety of via sizes and density, and new processes such as wafer thinning and stacked wafer bonding. This paper summarizes the developing metrology that has been used during via-middle & via-last TSV process development at EOL/ITRI. While there is a variety of metrology and inspection applications for 3D interconnect processing, the main topics covered here are via CD/depth measurement, thinned wafer inspection and wafer warpage measurement.
2003-06-26
VANDENBERG AIR FORCE BASE, CALIF. - At Vandenberg Air Force Base, Calif., the Pegasus launch vehicle is moved toward its hangar. The Pegasus will carry the SciSat-1 spacecraft in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere and preventing further ozone depletion. The mission is designed to last two years.
NASA Astrophysics Data System (ADS)
Sari, Anggi Ristiyana Puspita; Suyanta, LFX, Endang Widjajanti; Rohaeti, Eli
2017-05-01
Recognizing the importance of the development of critical thinking and science process skills, the instrument should give attention to the characteristics of chemistry. Therefore, constructing an accurate instrument for measuring those skills is important. However, integrated assessment instruments of this kind are limited in number. The purpose of this study is to validate an integrated assessment instrument for measuring students' critical thinking and science process skills on acid-base matter. The development of the test instrument adapted the McIntire model. The sample consisted of 392 second-grade high school students in the academic year of 2015/2016 in Yogyakarta. Exploratory Factor Analysis (EFA) was conducted to explore construct validity, whereas content validity was substantiated by Aiken's formula. The result shows that the KMO value is 0.714, which indicates sufficient items for each factor, and the Bartlett test is significant (a significance value of less than 0.05). Furthermore, the content validity coefficient, based on 8 experts, is obtained at 0.85. The findings support the integrated assessment instrument to measure critical thinking and science process skills on acid-base matter.
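Content validity via Aiken's formula can be computed directly: V = Σs / (n(c-1)), where s is each rating minus the lowest scale category, n the number of raters, and c the number of categories. A sketch assuming a 5-point expert rating scale (the scale actually used is not stated in the abstract):

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V content-validity coefficient for one item.
    ratings: expert ratings on an ordinal scale from lo to hi.
    The 5-point scale is an assumption for illustration."""
    n = len(ratings)
    c = hi - lo + 1                      # number of rating categories
    s = sum(r - lo for r in ratings)     # summed distance above the floor
    return s / (n * (c - 1))
```

On a 1-5 scale with 8 experts, the reported coefficient of 0.85 would correspond to ratings averaging about 4.4.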
Kazis, Lewis E; Sheridan, Robert L; Shapiro, Gabriel D; Lee, Austin F; Liang, Matthew H; Ryan, Colleen M; Schneider, Jeffrey C; Lydon, Martha; Soley-Bori, Marina; Sonis, Lily A; Dore, Emily C; Palmieri, Tina; Herndon, David; Meyer, Walter; Warner, Petra; Kagan, Richard; Stoddard, Frederick J; Murphy, Michael; Tompkins, Ronald G
2018-04-01
There has been little systematic examination of variation in pediatric burn care clinical practices and its effect on outcomes. As a first step, current clinical care processes need to be operationally defined. The highly specialized burn care units of the Shriners Hospitals for Children system present an opportunity to describe the processes of care. The aim of this study was to develop a set of process-based measures for pediatric burn care and examine adherence to them by providers in a cohort of pediatric burn patients. We conducted a systematic literature review to compile a set of process-based indicators. These measures were refined by an expert panel of burn care providers, yielding 36 process-based indicators in four clinical areas: initial evaluation and resuscitation, acute excisional surgery and critical care, psychosocial and pain control, and reconstruction and aftercare. We assessed variability in adherence to the indicators in a cohort of 1,076 children with burns at four regional pediatric burn programs in the Shriners Hospital system. The percentages of the cohort at each of the four sites were as follows: Boston, 20.8%; Cincinnati, 21.1%; Galveston, 36.0%; and Sacramento, 22.1%. The cohort included children who received care between 2006 and 2010. Adherence to the process indicators varied both across sites and by clinical area. Adherence was lowest for the clinical areas of acute excisional surgery and critical care, with a range of 35% to 48% across sites, followed by initial evaluation and resuscitation (range, 34%-60%). In contrast, the clinical areas of psychosocial and pain control and reconstruction and aftercare had relatively high adherence across sites, with ranges of 62% to 93% and 71% to 87%, respectively. Of the 36 process indicators, 89% differed significantly in adherence between clinical sites (p < 0.05). Acute excisional surgery and critical care exhibited the most variability. 
The development of this set of process-based measures represents an important step in the assessment of clinical practice in pediatric burn care. Substantial variation was observed in practices of pediatric burn care. However, further research is needed to link these process-based measures to clinical outcomes. Therapeutic/care management, level IV.
Exploring the role of auditory analysis in atypical compared to typical language development.
Grube, Manon; Cooper, Freya E; Kumar, Sukhbinder; Kelly, Tom; Griffiths, Timothy D
2014-02-01
The relationship between auditory processing and language skills has been debated for decades. Previous findings have been inconsistent, both in typically developing and impaired subjects, including those with dyslexia or specific language impairment. Whether correlations between auditory and language skills are consistent across different populations has hardly been addressed at all. The present work takes an exploratory approach, testing for patterns of correlations across a range of measures of auditory processing. In a recent study, we reported findings from a large cohort of eleven-year-olds on a range of auditory measures, and the data supported a specific role for the processing of short sequences in pitch and time in typical language development. Here we tested whether a group of individuals with dyslexic traits (DT group; n = 28) from the same year group would show the same pattern of correlations between auditory and language skills as the typically developing group (TD group; n = 173). In terms of raw scores, the DT group performed significantly more poorly on the language measures but not the auditory measures, including measures of pitch, time and rhythm, and timbre (modulation). In terms of correlations, those between short-sequence processing and language skills tended to decrease, contrasted by a significant increase in the correlation for basic, single-sound processing, in particular in the domain of modulation. The data support the notion that the fundamental relationship between auditory and language skills might differ in atypical compared to typical language development, with the implication that merging data or drawing inferences across populations might be problematic. Further examination of the relationship between both basic sound-feature analysis and music-like sound analysis and language skills in impaired populations might allow the development of appropriate training strategies.
These might include types of musical training to augment language skills via their common bases in sound sequence analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martinez, Gabrielle L; Magnotti, Gina M; Knox, Benjamin W
Quantitative measurements of the primary breakup process in diesel sprays are lacking due to a range of experimental and diagnostic challenges, including: high droplet number density environments, very small characteristic drop size scales (~1-10 μm), and high characteristic velocities in the primary breakup region (~600 m/s). Due to these challenges, existing measurement techniques have failed to resolve a sufficient range of the temporal and spatial scales involved and much remains unknown about the primary atomization process in practical diesel sprays. To gain a better insight into this process, we have developed a joint visible and x-ray extinction measurement technique to quantify axial and radial distributions of the path-integrated Sauter Mean Diameter (SMD) and Liquid Volume Fraction (LVF) for diesel-like sprays. This technique enables measurement of the SMD in regions of moderate droplet number density, enabling construction of the temporal history of drop size development within practical diesel sprays. The experimental campaign was conducted jointly at the Georgia Institute of Technology and Argonne National Laboratory using the Engine Combustion Network “Spray D” injector. X-ray radiography liquid absorption measurements, conducted at the Advanced Photon Source at Argonne, quantify the liquid-fuel mass and volume distribution in the spray. Diffused back-illumination liquid scattering measurements were conducted at Georgia Tech to quantify the optical thickness throughout the spray. By application of Mie-scatter equations, the ratio of the absorption and scattering extinction measurements is demonstrated to yield solutions for the SMD. This work introduces the newly developed scattering-absorption measurement technique and highlights the important considerations that must be taken into account when jointly processing these measurements to extract the SMD.
These considerations include co-alignment of measurements taken at different institutions, identification of viable regions where the measurement ratio can be accurately interpreted, and uncertainty analysis in the measurement ratio and resulting SMD. Because the measurement technique provides the spatial history of the SMD development, it is expected to be especially informative to the diesel spray modeling community. Results from this work will aid in understanding the effect of ambient densities and injection pressures on primary breakup and help assess the appropriateness of spray submodels for engine computational fluid dynamics codes.
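A minimal sketch of the ratio-based SMD recovery described above, assuming the large-droplet limit of the Mie extinction efficiency (Q_ext ≈ 2) and a uniform drop size along the line of sight; the function name and example values are illustrative, not from the paper:

```python
import numpy as np

def smd_from_extinction(tau_vis, plv_xray, q_ext=2.0):
    """Estimate the path-averaged Sauter Mean Diameter (SMD).

    For a polydisperse spray, optical thickness tau = (3*q_ext/2) * PLV / SMD,
    so SMD = (3*q_ext/2) * PLV / tau.

    tau_vis  : optical thickness from diffused back-illumination (dimensionless)
    plv_xray : projected liquid volume per unit area from x-ray radiography [m]
    q_ext    : Mie extinction efficiency (~2 for droplets much larger than the wavelength)
    """
    tau_vis = np.asarray(tau_vis, dtype=float)
    plv_xray = np.asarray(plv_xray, dtype=float)
    return 1.5 * q_ext * plv_xray / tau_vis

# Example: tau = 3.0 and projected liquid volume = 5e-6 m give SMD = 5 um
print(smd_from_extinction(3.0, 5e-6))
```

The same expression applies pointwise to radial profiles, which is why co-alignment of the two measurements is critical: the ratio is only meaningful where both fields are registered to the same line of sight.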
The TAME Project: Towards improvement-oriented software environments
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Rombach, H. Dieter
1988-01-01
Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.
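The goal/question/metric paradigm used by TAME can be sketched as a simple hierarchy: each goal is refined into questions, and each question is answered by metrics. This is an illustrative data structure only, not the TAME system's implementation; all names and values are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Metric:
    """A measurable quantity that helps answer a question."""
    name: str
    value: Optional[float] = None

@dataclass
class Question:
    """A question that operationalizes part of a goal."""
    text: str
    metrics: List[Metric] = field(default_factory=list)

@dataclass
class Goal:
    """A measurement goal, refined into questions and metrics."""
    purpose: str
    questions: List[Question] = field(default_factory=list)

# Hypothetical example of the goal -> question -> metric refinement
goal = Goal(
    purpose="Characterize the effort spent in system testing",
    questions=[
        Question(
            text="How much effort is spent per defect found?",
            metrics=[Metric("person_hours_per_defect", 6.5)],
        )
    ],
)
```

The top-down refinement is what lets analysis feed back into construction: metrics are interpreted only in the context of the question and goal that motivated them.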
Total body calcium analysis using the Ca-12(n, alpha) Ar-37 reaction
NASA Technical Reports Server (NTRS)
Lewellen, T. K.; Nelp, W. B.
1977-01-01
A low-dose neutron activation technique was developed to measure total body calcium in vivo. The effort included the development of irradiation and processing facilities and the conduct of human studies to determine the accuracy and precision of measurement attainable with the systems.
Measuring housing quality in the absence of a monetized real estate market.
Rindfuss, Ronald R; Piotrowski, Martin; Thongthai, Varachai; Prasartkul, Pramote
2007-03-01
Measuring housing quality or value or both has been a weak component of demographic and development research in less developed countries that lack an active real estate (housing) market. We describe a new method based on a standardized subjective rating process. It is designed to be used in settings that do not have an active, monetized housing market. The method is applied in an ongoing longitudinal study in north-east Thailand and could be straightforwardly used in many other settings. We develop a conceptual model of the process whereby households come to reside in high-quality or low-quality housing units. We use this theoretical model in conjunction with longitudinal data to show that the new method of measuring housing quality behaves as theoretically expected, thus providing evidence of face validity.
NASA Astrophysics Data System (ADS)
Astuti, Sri Rejeki Dwi; Suyanta, LFX, Endang Widjajanti; Rohaeti, Eli
2017-05-01
Demands on assessment in the learning process have changed with policy: assessment now emphasizes not only knowledge but also skills and attitudes. In practice, however, there are many obstacles to measuring them. This paper aimed to describe how to develop an integrated assessment instrument and to verify the instrument's validity, including content validity and construct validity. The instrument was developed following McIntire's test development model, and data on the development process were acquired at each step of test development. The initial product was reviewed by three peer reviewers and six expert judges (two subject matter experts, two evaluation experts, and two chemistry teachers) to establish content validity. The research involved 376 first-grade students from two senior high schools in Bantul Regency to establish construct validity. Content validity was analyzed using Aiken's formula, and construct validity was verified by exploratory factor analysis using SPSS ver 16.0. The results show that all constructs in the integrated assessment instrument are valid in terms of both content validity and construct validity. The integrated assessment instrument is therefore suitable for measuring critical thinking abilities and science process skills of senior high school students on the electrolyte solution topic.
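Aiken's formula referred to above is standard: for one item rated by n judges on a lo-to-hi scale, V = Σ(r_i − lo) / (n·(hi − lo)), ranging from 0 to 1. A minimal sketch; the nine ratings and the 1-5 scale are invented for illustration, since the abstract does not state the scale used:

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V content-validity coefficient for a single item.

    ratings : relevance scores from n raters on a lo..hi scale
    Returns sum(r - lo) / (n * (hi - lo)), between 0 (invalid) and 1 (fully valid).
    """
    n = len(ratings)
    s = sum(r - lo for r in ratings)
    return s / (n * (hi - lo))

# Hypothetical: nine judges rate an item 4 or 5 on a 1-5 relevance scale
print(round(aikens_v([5, 4, 5, 5, 4, 5, 4, 5, 5], lo=1, hi=5), 3))  # 0.917
```

An item's V is then compared against a critical value (which depends on the number of raters and categories) to decide whether the item is retained.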
How Much Popcorn Will Our Classroom Hold?
ERIC Educational Resources Information Center
Rommel-Esham, Katie
2007-01-01
"How much popcorn will our classroom hold?" This intriguing question sparked a terrific integrated science and math exploration that the author conducted with fifth-and sixth-grade students. In the process of finding the classroom's volume, students developed science-process skills (e.g., developing a plan, measurement, collecting and interpreting…
Development of The Science Processes Test.
ERIC Educational Resources Information Center
Ludeman, Robert R.
Presented is a description and copy of a test manual developed to include items in the test on the basis of children's performance; each item correlated highly with performance on an external criterion. The external criterion was the Individual Competency Measures of the elementary science program Science - A Process Approach (SAPA). The test…
Monitoring cure of composite resins using frequency dependent electromagnetic sensing techniques
NASA Technical Reports Server (NTRS)
Kranbuehl, D. E.; Hoff, M. S.; Loos, A. C.; Freeman, W. T., Jr.; Eichinger, D. A.
1988-01-01
A nondestructive in situ measurement technique has been developed for monitoring and measuring the cure processing properties of composite resins. Frequency dependent electromagnetic sensors (FDEMS) were used to directly measure resin viscosity during cure. The effects of the cure cycle and resin aging on the viscosity during cure were investigated using the sensor. Viscosity measurements obtained using the sensor are compared with the viscosities calculated by the Loos-Springer cure process model. Good overall agreement was obtained except for the aged resin samples.
2005-07-01
approach for measuring the return on Information Technology (IT) investments. A review of existing methods suggests the difficulty in adequately...measuring the returns of IT at various levels of analysis (e.g., firm or process level). To address this issue, this study aims to develop a method for...view (KBV), this paper proposes an analytic method for measuring the historical revenue and cost of IT investments by estimating the amount of
Laser triangulation method for measuring the size of parking claw
NASA Astrophysics Data System (ADS)
Liu, Bo; Zhang, Ming; Pang, Ying
2017-10-01
With the development of science and technology and the maturing of measurement technology, 3D profile measurement has developed rapidly. Three-dimensional measurement technology is widely used in mold manufacturing, industrial inspection, automated processing and manufacturing, and other areas. In many situations in scientific research and industrial production, it is necessary to convert original mechanical parts quickly and accurately into a 3D data model on the computer. Many methods have been developed to measure contour dimensions; laser triangulation is one of the most widely used.
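For the simplest parallel-axis laser-triangulation geometry, the range follows from pinhole-camera similar triangles as z = f·b/x, where b is the laser-to-camera baseline, f the focal length, and x the spot displacement on the sensor. A hedged sketch; all parameter values are illustrative, and real sensors typically use a tilted-sensor (Scheimpflug) arrangement with a calibrated lookup table instead:

```python
def triangulation_range(x_pixel, pixel_pitch, focal_len, baseline):
    """Range to a laser spot for a parallel-axis triangulation setup.

    x_pixel     : spot displacement from the principal point [pixels]
    pixel_pitch : sensor pixel size [m]
    focal_len   : camera focal length [m]
    baseline    : laser-to-camera separation [m]
    Returns z = f * b / x from simple pinhole geometry.
    """
    x = x_pixel * pixel_pitch
    return focal_len * baseline / x

# 16 mm lens, 50 mm baseline, spot 200 px (5 um pitch) off-axis -> z = 0.8 m
print(triangulation_range(200, 5e-6, 16e-3, 50e-3))
```

The inverse dependence on x is why triangulation resolution degrades with distance: a fixed pixel quantization step corresponds to a larger range step at longer ranges.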
ERIC Educational Resources Information Center
Robinson, Anna; Elliott, Robert
2016-01-01
People with autism spectrum disorder (ASD), can have difficulties in emotion processing, including recognising their own and others' emotions, leading to problems in emotion regulation and interpersonal relating. This study reports the development and piloting of the Client Emotional Processing Scale-Autism Spectrum (CEPS-AS), a new observer…
Textile composite processing science
NASA Technical Reports Server (NTRS)
Loos, Alfred C.; Hammond, Vincent H.; Kranbuehl, David E.; Hasko, Gregory H.
1993-01-01
A multi-dimensional model of the Resin Transfer Molding (RTM) process was developed for the prediction of the infiltration behavior of a resin into an anisotropic fiber preform. Frequency dependent electromagnetic sensing (FDEMS) was developed for in-situ monitoring of the RTM process. Flow visualization and mold filling experiments were conducted to verify sensor measurements and model predictions. Test results indicated good agreement between model predictions, sensor readings, and experimental data.
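The multi-dimensional RTM infiltration model itself is not given in the abstract, but the underlying physics is Darcy flow through the fiber preform. A one-dimensional, constant-injection-pressure sketch of the resin flow front (all symbols and values illustrative, not the paper's model):

```python
import math

def flow_front_1d(t, permeability, delta_p, porosity, viscosity):
    """1-D Darcy-law resin flow-front position under constant injection pressure.

    x_f(t) = sqrt(2 * K * dP * t / (phi * mu))

    t            : time since injection start [s]
    permeability : preform permeability K [m^2]
    delta_p      : injection pressure drop dP [Pa]
    porosity     : preform porosity phi (0..1)
    viscosity    : resin viscosity mu [Pa*s]
    """
    return math.sqrt(2.0 * permeability * delta_p * t / (porosity * viscosity))

# Illustrative: K = 1e-10 m^2, dP = 1e5 Pa, phi = 0.5, mu = 0.1 Pa*s, t = 100 s
print(flow_front_1d(100.0, 1e-10, 1e5, 0.5, 0.1))  # 0.2 m
```

Because viscosity rises as the resin cures, coupling a flow model like this to in-situ FDEMS viscosity readings is what allows mold-filling predictions to be checked against sensor data.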
San Juan, Valerie; Astington, Janet Wilde
2012-03-01
Recent advancements in the field of infant false-belief reasoning have brought into question whether performance on implicit and explicit measures of false belief is driven by the same level of representational understanding. The success of infants on implicit measures has also raised doubt over the role that language development plays in the development of false-belief reasoning. In the current paper, we argue that children's performance on disparate measures cannot be used to infer similarities in understanding across different age groups. Instead, we argue that development must continue to occur between the periods when children can reason implicitly and then explicitly about false belief. We then propose mechanisms by which language associated with false-belief tasks facilitates this transition by assisting with both the processes of elicited response selection and the formation of metarepresentational understanding.
NASA Technical Reports Server (NTRS)
Bose, Deepak
2012-01-01
The design of entry vehicles requires predictions of the aerothermal environment during the hypersonic phase of their flight trajectories. These predictions are made using computational fluid dynamics (CFD) codes that often rely on physics and chemistry models of nonequilibrium processes. The primary processes of interest are gas-phase chemistry, internal energy relaxation, electronic excitation, nonequilibrium emission and absorption of radiation, and gas-surface interaction leading to surface recession and catalytic recombination. NASA's Hypersonics Project is advancing the state of the art in modeling of nonequilibrium phenomena by making detailed spectroscopic measurements in shock tubes and arcjets, using ab initio quantum mechanical techniques to develop fundamental chemistry and spectroscopic databases, making fundamental measurements of finite-rate gas-surface interactions, and implementing detailed mechanisms in state-of-the-art CFD codes. The development of new models is based on validation with relevant experiments. We will present the latest developments and a roadmap for the technical areas mentioned above.
Russo, Natalie; Flanagan, Tara; Iarocci, Grace; Berringer, Darlene; Zelazo, Philip David; Burack, Jacob A
2007-10-01
Individuals with autism demonstrate impairments on measures of executive function (EF) relative to typically developing comparison participants. EF comprises several processes, including inhibition, working memory, and set shifting, that develop throughout the lifespan. Impairments in EF may appear early in development and persist, or may represent a more transient delay that resolves with time. Given the unevenness of the cognitive profile of persons with autism, understanding the development of EF poses methodological challenges. These issues include those related to matching measures and the choice of comparison participants to which the performance of persons with autism will be compared. In the current review, we attempt to break down the processes of inhibition, working memory, and set shifting among persons with autism. We propose to do this within a developmental perspective that highlights how matching measures and comparison participants can affect the interpretation of research findings.
Evaluation and development plan of NRTA measurement methods for the Rokkasho Reprocessing Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, T.K.; Hakkila, E.A.; Flosterbuer, S.F.
Near-real-time accounting (NRTA) has been proposed as a safeguards method at the Rokkasho Reprocessing Plant (RRP), a large-scale commercial facility for reprocessing spent fuel from boiling water and pressurized water reactors. NRTA for RRP requires material balance closures every month. To develop a more effective and practical NRTA system for RRP, we have evaluated NRTA measurement techniques and systems that might be implemented in both the main process and the co-denitration process areas at RRP to analyze the concentrations of plutonium in solutions and mixed oxide powder. Based on the comparative evaluation, including performance, reliability, design criteria, operation methods, maintenance requirements, and estimated costs for each possible measurement method, recommendations for development were formulated. This paper discusses the evaluations and reports on the recommendation of the NRTA development plan for potential implementation at RRP.
Bowen, Judith L; Stevens, David P; Sixta, Connie S; Provost, Lloyd; Johnson, Julie K; Woods, Donna M; Wagner, Edward H
2010-09-01
The Chronic Care Model (CCM) is a multidimensional framework designed to improve care for patients with chronic health conditions. The model strives for productive interactions between informed, activated patients and proactive practice teams, resulting in better clinical outcomes and greater satisfaction. While measures for improving care may be clear, measures of residents' competency to provide chronic care do not exist. This report describes the process used to develop educational measures and results from CCM settings that used them to monitor curricular innovations. Twenty-six academic health care teams participating in the national and California Academic Chronic Care Collaboratives. Using successive discussion groups and surveys, participants engaged in an iterative process to identify desirable and feasible educational measures for curricula that addressed educational objectives linked to the CCM. The measures were designed to facilitate residency programs' abilities to address new accreditation requirements and tested with teams actively engaged in redesigning educational programs. Field notes from each discussion and lists from work groups were synthesized using the CCM framework. Descriptive statistics were used to report survey results and measurement performance. Work groups generated educational objectives and 17 associated measurements. Seventeen (65%) teams provided feasibility and desirability ratings for the 17 measures. Two process measures were selected for use by all teams. Teams reported variable success using the measures. Several teams reported use of additional measures, suggesting more extensive curricular change. Using an iterative process in collaboration with program participants, we successfully defined a set of feasible and desirable education measures for academic health care teams using the CCM. 
These were used variably to measure the results of curricular changes, while simultaneously addressing requirements for residency accreditation.
Development of on-line laser power monitoring system
NASA Astrophysics Data System (ADS)
Ding, Chien-Fang; Lee, Meng-Shiou; Li, Kuan-Ming
2016-03-01
Since the laser was invented, lasers have been applied in many fields, such as material processing, communication, measurement, biomedical engineering, and defense industries. Laser power is an important parameter in laser material processing, e.g., laser cutting and laser drilling. However, because laser power is easily affected by ambient temperature, it should be monitored to ensure effective material processing. In addition, the response time of current laser power meters is too long to measure laser power accurately over short intervals. A faster measurement makes it possible to know the status of the laser power and achieve effective material processing at the same time. To monitor the laser power, this study utilizes a CMOS (complementary metal-oxide-semiconductor) camera to develop an on-line laser power monitoring system. The CMOS camera captures images of the incident laser beam after it is split and attenuated by a beam splitter and a neutral density filter. By comparing the average brightness of the beam spots with measurement results from a laser power meter, the laser power can be estimated. Under continuous measuring mode, the average measuring error is about 3%, and the response time is at least 3.6 seconds shorter than that of thermopile power meters; under trigger measuring mode, which enables the CMOS camera to synchronize with intermittent laser output, the average measuring error is less than 3%, and the shortest response time is 20 milliseconds.
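The brightness-to-power estimation described above amounts to calibrating the camera against the power meter. A minimal sketch using a linear least-squares fit; the calibration data here are invented for illustration and a real system would also need to verify linearity and avoid sensor saturation:

```python
import numpy as np

# Hypothetical calibration pairs: CMOS mean spot brightness vs. power-meter reading [W]
brightness = np.array([20.0, 40.0, 60.0, 80.0])
power_ref = np.array([1.1, 2.0, 3.1, 4.0])

# Least-squares linear model: power ~ a * brightness + b
a, b = np.polyfit(brightness, power_ref, 1)

def estimate_power(mean_brightness):
    """Estimate laser power from the average brightness of the beam-spot image."""
    return a * mean_brightness + b

print(round(estimate_power(50.0), 2))  # 2.55
```

Once calibrated, each captured frame yields a power estimate at the camera's frame rate, which is the source of the fast response time relative to thermopile meters.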
Accountability for the Quality of Care Provided to People with Serious Illness
Hudson Scholle, Sarah; Briefer French, Jessica
2018-01-01
Background: Care for patients with serious illness is an emerging practice area that has gained attention as value-based purchasing has increased. While the number of programs is growing, their impact on care quality and outcomes is unknown. Objective: With support from the Gordon and Betty Moore Foundation, the National Committee for Quality Assurance (NCQA) is assessing the feasibility of creating an accountability program focused on serious illness care. Methods: This article describes the process of developing an accountability program, findings from our initial work, and our plans to develop measures for a serious illness care accountability program. We focused on three questions: 1. What patient populations should be targeted for measurement? 2. What entities have accountability for ensuring high-quality care for serious illness? 3. What structures, processes, and outcomes should be evaluated in an accountability program for serious illness care? Results: Our environmental scan showed that the evidence base for specific patient populations or care models is not sufficiently mature to justify traditional structure and process measures. In visits to serious illness care programs, we observed different staffing models, care models, care settings, and payment structures. We found a gap between recommended inclusion criteria and services when compared to the inclusion criteria and services offered by existing programs. Conclusions: To address these challenges, NCQA intends to develop outcome measures driven by patient and family priorities. Structure and process measures will focus on building organizations' capacity to measure outcomes, including patient engagement and outcomes, linked to patient goals. PMID:29313755
Speech Intelligibility Predicted from Neural Entrainment of the Speech Envelope.
Vanthornhout, Jonas; Decruy, Lien; Wouters, Jan; Simon, Jonathan Z; Francart, Tom
2018-04-01
Speech intelligibility is currently measured by scoring how well a person can identify a speech signal. The results of such behavioral measures reflect neural processing of the speech signal but are also influenced by language processing, motivation, and memory. Electrophysiological measures of hearing often give insight into the neural processing of sound. However, most methods use non-speech stimuli, making it hard to relate the results to behavioral measures of speech intelligibility. The use of natural running speech as a stimulus in electrophysiological measures of hearing is a paradigm shift that makes it possible to bridge the gap between behavioral and electrophysiological measures. Here, by decoding the speech envelope from the electroencephalogram and correlating it with the stimulus envelope, we demonstrate an electrophysiological measure of neural processing of running speech. We show that behaviorally measured speech intelligibility is strongly correlated with our electrophysiological measure. Our results pave the way towards an objective and automatic way of assessing neural processing of speech presented through auditory prostheses, reducing confounds such as attention and cognitive capabilities. We anticipate that our electrophysiological measure will allow better differential diagnosis of the auditory system and the development of closed-loop auditory prostheses that automatically adapt to individual users.
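A minimal sketch of the decoding approach: reconstruct the envelope from multichannel EEG with a linear (ridge-regression) backward decoder and correlate the reconstruction with the true envelope. Real decoders include a range of time lags per channel; this lag-free toy example with synthetic data only illustrates the principle:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 1000 samples, 8 EEG channels; the envelope leaks into each channel
envelope = rng.standard_normal(1000)
mixing = rng.standard_normal(8)
eeg = np.outer(envelope, mixing) + 0.5 * rng.standard_normal((1000, 8))

# Ridge-regression backward decoder: weights w reconstruct the envelope from EEG
lam = 1.0
w = np.linalg.solve(eeg.T @ eeg + lam * np.eye(8), eeg.T @ envelope)
reconstructed = eeg @ w

# Pearson correlation between the reconstructed and actual envelope
r = np.corrcoef(reconstructed, envelope)[0, 1]
print(round(r, 2))  # close to 1 for this easy toy example
```

In practice the correlation for real EEG is far smaller but still systematically related to behaviorally measured intelligibility, which is the basis of the proposed objective measure.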
Development of a neutron measurement system in unified non-destructive assay for the PRIDE facility
NASA Astrophysics Data System (ADS)
Seo, Hee; Park, Se-Hwan; Won, Byung-Hee; Ahn, Seong-Kyu; Shin, Hee-Sung; Na, Sang-Ho; Song, Dae-Yong; Kim, Ho-Dong; Lee, Seung Kyu
2013-12-01
The Korea Atomic Energy Research Institute (KAERI) has made an effort to develop pyroprocessing technology to resolve an on-going problem in Korea, i.e., the management of spent nuclear fuels. To this end, a test-bed facility for pyroprocessing, called PRIDE (PyRoprocessing Integrated inactive DEmonstration facility), is being constructed at KAERI. The main objective of PRIDE is to evaluate the performance of the unit processes, remote operation, maintenance, and proliferation resistance. In addition, integrating all unit processes into a one-step process is also one of the main goals. PRIDE can also provide a good opportunity to test safeguards instrumentation for a pyroprocessing facility, such as nuclear material accounting devices, surveillance systems, radiation monitoring systems, and process monitoring systems. In the present study, a non-destructive assay (NDA) system for the testing of nuclear material accountancy of PRIDE was designed by integrating three different NDA techniques, i.e., neutron, gamma-ray, and mass measurements. The developed neutron detection module consists of 56 3He tubes and 16 AMPTEK A111 signal processing circuits. The amplifiers were matched in terms of gain and showed good uniformity after a gain-matching procedure (%RSD=0.37%). The axial and the radial efficiency distributions within the cavity were then measured using a 252Cf neutron source and were compared with the MCNPX calculation results. The measured efficiency distributions showed excellent agreement with the calculations, which confirmed the accuracy of the MCNPX model of the system.
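The %RSD figure quoted above is the percent relative standard deviation of the matched channel gains. A minimal sketch; the gain values are invented for illustration, and the sample standard deviation (ddof=1) is an assumption since the abstract does not specify the estimator:

```python
import numpy as np

def percent_rsd(values):
    """Percent relative standard deviation: 100 * sample std / mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical gains of four gain-matched amplifier channels
print(round(percent_rsd([1.000, 1.004, 0.998, 1.002]), 2))  # 0.26
```

A small %RSD across channels means each tube contributes nearly equally to the summed count rate, which is what the gain-matching procedure is meant to guarantee.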
Automated assembly of VECSEL components
NASA Astrophysics Data System (ADS)
Brecher, C.; Pyschny, N.; Haag, S.; Mueller, T.
2013-02-01
Because an external-cavity architecture enables the integration of additional elements into the cavity (e.g., for mode control, frequency conversion, wavelength tuning, or passive mode-locking), VECSELs are a rapidly developing laser technology. Nevertheless, they often have to compete with direct (edge-) emitting laser diodes, which can have significant cost advantages thanks to their rather simple structure and production processes. One way to compensate for the economic disadvantages of VECSELs is to optimize each component in terms of quality and cost and to apply more efficient (batch) production processes. In this context, the paper presents recent process developments for the automated assembly of VECSELs using a new type of desktop assembly station with an ultra-precise micromanipulator. The core concept is to create a dedicated process-development environment from which implemented processes can be transferred fluently to production equipment. By now, two types of processes have been put into operation on the desktop assembly station: 1.) passive alignment of the pump optics, implementing a camera-based alignment process in which the pump-spot geometry and position on the semiconductor chip are analyzed and evaluated; 2.) active alignment of the end mirror based on output-power measurements and optimization algorithms. In addition to the core concept and corresponding hardware and software developments, detailed results of both processes are presented, explaining measurement setups as well as alignment strategies and results.
Gaguski, Michele E; George, Kim; Bruce, Susan D; Brucker, Edie; Leija, Carol; LeFebvre, Kristine B; Mackey, Heather
2017-12-01
A project team was formulated to create evidence-based oncology nurse generalist competencies (ONGCs) to establish best practices in competency development, including high-risk tasks, critical thinking criteria, and measurement of key areas for oncology nurses. This article aims to describe the process and the development of ONGCs. It explains how the ONGCs were accomplished and includes outcomes and suggestions for use in clinical practice. Institutions can use the ONGCs to assess and develop competency programs, offer educational strategies to measure proficiency, and establish processes to foster a workplace committed to mentoring and teaching future oncology nurses.
2003-06-27
KENNEDY SPACE CENTER, FLA. - Inside the hangar at Vandenberg Air Force Base, Calif., workers wait for the Pegasus launch vehicle to be moved inside. The Pegasus will carry the SciSat-1 spacecraft in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The scientific mission of SciSat-1 is to measure and understand the chemical processes that control the distribution of ozone in the Earth’s atmosphere, particularly at high altitudes. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
Scoping review of potential quality indicators for hip fracture patient care
Pitzul, Kristen B; Munce, Sarah E P; Perrier, Laure; Beaupre, Lauren; Morin, Suzanne N; McGlasson, Rhona; Jaglal, Susan B
2017-01-01
Objective: The purpose of this study is to identify existing or potential quality of care indicators (ie, current indicators as well as process and outcome measures) in the acute or postacute period, or across the continuum of care, for older adults with hip fracture. Design: Scoping review. Setting: All care settings. Search strategy: English peer-reviewed studies published from January 2000 to January 2016 were included. Literature search strategies were developed, and the search was peer-reviewed. Two reviewers independently piloted all forms, and all articles were screened in duplicate. Results: The search yielded 2729 unique articles, of which 302 articles were included (11.1%). When indicators (eg, in-hospital mortality, acute care length of stay) and potential indicators (eg, comorbidities developed in hospital, walking ability) were grouped by the outcome or process construct they were trying to measure, the most common constructs were measures of mortality (outcome), length of stay (process) and time-sensitive measures (process). There was heterogeneity in definitions within constructs between studies. There was also a paucity of indicators and potential indicators in the postacute period. Conclusions: To improve quality of care for patients with hip fracture and create a more efficient healthcare system, mechanisms for the measurement of quality of care across the entire continuum, not just during the acute period, are required. Future research should focus on decreasing the heterogeneity in definitions of quality indicators and on the development and implementation of quality indicators for the postacute period. PMID:28325859
Methods of measurement for semiconductor materials, process control, and devices
NASA Technical Reports Server (NTRS)
Bullis, W. M. (Editor)
1972-01-01
Activities directed toward the development of methods of measurement for semiconductor materials, process control, and devices are described. Accomplishments include determining the reasons for differences in measurements of transistor delay time, identifying an energy level model for gold-doped silicon, and finding evidence that an ultrasonic bonding tool need not grip the wire and move it across the substrate metallization to make the bond. Work is continuing on measurement of the resistivity of semiconductor crystals; study of gold-doped silicon; development of the infrared response technique; evaluation of wire bonds and die attachment; measurement of thermal properties of semiconductor devices, of delay time and related carrier transport properties in junction devices, and of noise properties of microwave diodes; and characterization of silicon nuclear radiation detectors.
Global Precipitation Measurement: GPM Microwave Imager (GMI) Algorithm Development Approach
NASA Technical Reports Server (NTRS)
Stocker, Erich Franz
2009-01-01
This slide presentation reviews the approach to developing the Global Precipitation Measurement algorithm. It covers the responsibilities for algorithm development and calibration, as well as the orbit and the sun angle. The algorithm code will be tested with synthetic data generated by the Precipitation Processing System (PPS).
Seed Cotton Mass Flow Measurement in the Gin
USDA-ARS?s Scientific Manuscript database
Seed cotton mass flow measurement is necessary for the development of improved gin process control systems that can increase gin efficiency and improve fiber quality. Previous studies led to the development of a seed cotton mass flow rate sensor based on the static pressure drop across the blowbox, ...
The "Test of Financial Literacy": Development and Measurement Characteristics
ERIC Educational Resources Information Center
Walstad, William B.; Rebeck, Ken
2017-01-01
The "Test of Financial Literacy" (TFL) was created to measure the financial knowledge of high school students. Its content is based on the standards and benchmarks stated in the "National Standards for Financial Literacy" (Council for Economic Education 2013). The test development process involved extensive item writing and…
Ownby, Raymond L; Acevedo, Amarilis; Waldrop-Valverde, Drenna; Jacobs, Robin J; Caballero, Joshua; Davenport, Rosemary; Homs, Ana-Maria; Czaja, Sara J; Loewenstein, David
2013-01-01
Current measures of health literacy have been criticized on a number of grounds, including use of a limited range of content, development on small and atypical patient groups, and poor psychometric characteristics. In this paper, we report the development and preliminary validation of a new computer-administered and -scored health literacy measure addressing these limitations. Items in the measure reflect a wide range of content related to health promotion and maintenance as well as care for diseases. The development process has focused on creating a measure that will be useful in both Spanish and English, while not requiring substantial time for clinician training and individual administration and scoring. The items incorporate several formats, including questions based on brief videos, which allow for the assessment of listening comprehension and the skills related to obtaining information on the Internet. In this paper, we report the interim analyses detailing the initial development and pilot testing of the items (phase 1 of the project) in groups of Spanish and English speakers. We then describe phase 2, which included a second round of testing of the items, in new groups of Spanish and English speakers, and evaluation of the new measure's reliability and validity in relation to other measures. Data are presented that show that four scales (general health literacy, numeracy, conceptual knowledge, and listening comprehension), developed through a process of item and factor analyses, have significant relations to existing measures of health literacy.
Defining the medical home: the Oregon experience.
Stenger, Robert J; Smith, Jeanene; McMullan, J Bart; Rodriguez, Glenn S; Dorr, David A; Minniti, Mary; Jaffe, Arthur; Pollack, David; Anderson, Mitchell; Kilo, Charles M; Saultz, John W
2012-01-01
The patient-centered medical home (PCMH) is emerging as a key strategy to improve health outcomes, reduce total costs, and strengthen primary care, but a myriad of operational measures of the PCMH have emerged. In 2009, the state of Oregon convened a public, legislatively mandated committee charged with developing PCMH measures. We report on the process of, outcomes of, and lessons learned by this committee. The Oregon PCMH advisory committee was appointed by the director of the Oregon Department of Human Services and held 7 public meetings between October 2009 and February 2010. The committee engaged a diverse group of Oregon stakeholders, including a variety of practicing primary care physicians. The committee developed a PCMH measurement framework, including 6 core attributes, 15 standards, and 27 individual measures. Key successes of the committee's work were to describe PCMH core attributes and functions in patient-centered language and to achieve consensus among a diverse group of stakeholders. Oregon's PCMH advisory committee engaged local stakeholders in a process that resulted in a shared PCMH measurement framework and addressed stakeholders' concerns. The state of Oregon now has implemented a PCMH program using the framework developed by the PCMH advisory committee. The Oregon experience demonstrates that a brief public process can be successful in producing meaningful consensus on PCMH roles and functions and advancing PCMH policy.
Thermal Remote Sensing and the Thermodynamics of Ecosystem Development
NASA Technical Reports Server (NTRS)
Luvall, Jeffrey C.; Kay, James J.; Fraser, Roydon F.
2000-01-01
Thermal remote sensing can provide environmental measuring tools with capabilities for measuring ecosystem development and integrity. Recent advances in applying principles of nonequilibrium thermodynamics to ecology provide fundamental insights into energy partitioning in ecosystems. Ecosystems are nonequilibrium systems, open to material and energy flows, which grow and develop structures and processes to increase energy degradation. More developed terrestrial ecosystems will be more effective at dissipating the solar gradient (degrading its energy content). This can be measured by the effective surface temperature of the ecosystem on a landscape scale.
Imaging-based optical caliper for objects in hot manufacturing processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Howard
OG Technologies, Inc. (OGT), in conjunction with its industrial and academic partners, proposes to develop an Imaging-Based Optical Caliper (hereafter referred to as OC) for objects in hot manufacturing processes. The goal is to develop and demonstrate the OC using the synergy of OGT's current technology pool and other innovations to provide a lightweight, robust, safe, and accurate portable dimensional measurement device for hot objects, with integrated wireless communication to enable real-time process control. The technical areas of interest in this project are the combination of advanced imaging, sensor fusion, and process control. OGT believes that the synergistic interactions between its current set of technologies and other innovations could deliver viable, high-impact products for hot manufacturing processes such as steel making, steel rolling, open-die forging, and glass production, resulting in a new energy-efficient control paradigm through improved yield, prolonged tool life, and improved quality. In-line dimension measurement and control is of interest to steel makers, yet current industry practice focuses only on final product dimensions rather than the whole process, owing to limited manpower, system cost, and operator safety concerns. As sensor technologies advance, the industry has begun to see the need for better dimensional control throughout the process but lacks the proper tools to achieve it. OGT, along with its industrial partners, represents an indigenous technological development effort serving the US steel industry. The immediate market that can use and benefit from the proposed OC is the steel industry. Deployment of the OC has the potential to reduce energy waste, CO2 emissions, waste water, toxic waste, and so forth. With further expanded functionality, the potential market also includes the hot forging and freight industries.
The OC prototypes were fabricated and progressively tested on-site at several steel mills and hot forging facilities for evaluation. Software refinements and new calibration procedures were carried out to overcome the glitches discovered. Progress was presented to hot manufacturing facilities worldwide, and the response showed great interest and a practical need for this product. OGT is in pilot commercialization mode for this new development. The R&D team also successfully developed a 3D measurement function, requiring no additional hardware or equipment, to measure the dimensions of low- or room-temperature objects. Several tests were conducted in real production environments to evaluate the measurement results. This new application will require additional product design development.
NASA Astrophysics Data System (ADS)
Li, Qing; Lin, Haibo; Xiu, Yu-Feng; Wang, Ruixue; Yi, Chuijie
A test platform for wheat precision seeding based on image processing techniques was designed to support the development of a wheat precision seed metering device with high efficiency and precision. Using image processing techniques, the platform captures images of wheat seeds falling from the seed metering device onto the conveyor belt. These data are then processed and analyzed to calculate the qualified rate, reseeding rate, leakage sowing rate, etc. This paper introduces the overall structure and design parameters of the platform and the hardware and software of the image acquisition system, as well as the method of seed identification and seed-spacing measurement based on image thresholding and locating each seed's center. Analysis of the experimental results shows a measurement error of less than ±1 mm.
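The seed-identification step described above (threshold the image, then locate each seed's center) can be sketched as follows. This is an illustrative reimplementation of the general idea on synthetic data, not the authors' code; all function names, the threshold value, and the toy image are hypothetical.

```python
import numpy as np
from collections import deque

def label_components(mask):
    """Label 4-connected foreground regions in a boolean mask (simple BFS)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                q = deque([(i, j)])
                labels[i, j] = current
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            q.append((ny, nx))
    return labels, current

def seed_centers_and_spacing(image, threshold=128):
    """Threshold the image, compute seed centroids, return centers and gaps."""
    mask = image > threshold
    labels, n = label_components(mask)
    centers = []
    for k in range(1, n + 1):
        ys, xs = np.nonzero(labels == k)
        centers.append((ys.mean(), xs.mean()))
    centers.sort(key=lambda c: c[1])          # order along the belt direction
    spacings = np.diff([c[1] for c in centers])  # pixel gaps between centroids
    return centers, spacings

# Synthetic belt image: three bright "seeds" on a dark background.
img = np.zeros((20, 60), dtype=np.uint8)
for cx in (10, 30, 50):
    img[8:12, cx - 2:cx + 2] = 255

centers, spacings = seed_centers_and_spacing(img)
print(len(centers), spacings)  # 3 seeds detected, ~20 px between centroids
```

With a calibrated pixel-to-millimetre scale, the centroid gaps translate directly into seed spacing, from which qualified, reseeding, and missed-sowing rates can be tallied.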
Tomassini, R; Rossi, G; Brouckaert, J-F
2016-10-01
A simultaneous blade tip timing (BTT) and blade tip clearance (BTC) measurement system enables the determination of turbomachinery blade vibrations and the monitoring of the running gap between the blade tip and the casing. This contactless instrumentation presents several advantages over the well-known telemetry system with strain gauges, at the cost of a more complex data processing procedure. The probes used can be optical, capacitive, eddy current, or microwave, each with its own dedicated electronics, and many different signal processing algorithms exist. Every company working in this field has developed its own processing method and sensor technology; hence, when the same test is repeated with different instrumentation, the answers often differ. Moreover, it is rarely possible to achieve reliability for in-service measurements. Developments are therefore focused on innovative instrumentation and a common standard. This paper focuses on the results achieved using a novel magnetoresistive sensor for simultaneous tip timing and tip clearance measurements. The sensor measurement principle is described, and the sensitivity to gap variation is investigated. For vibration measurement, experimental investigations were performed at the Air Force Institute of Technology (ITWL, Warsaw, Poland) in a real aeroengine and in the von Karman Institute (VKI) R2 compressor rig. The advantages and limitations of the magnetoresistive probe for turbomachinery testing are highlighted.
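The tip-timing principle underlying such systems can be illustrated with a small calculation: a non-vibrating blade passes the casing probe at evenly spaced instants, and the deviation of the measured arrival time, scaled by the tip tangential speed, yields the circumferential tip deflection. The sketch below uses hypothetical numbers and is not the processing of any particular instrument described in the paper.

```python
import math

def tip_deflections(arrival_times_s, n_blades, rpm, tip_radius_m):
    """Estimate circumferential tip deflections from once-per-rev-referenced
    blade arrival times at a single casing probe."""
    omega = 2 * math.pi * rpm / 60.0   # shaft angular speed, rad/s
    tip_speed = omega * tip_radius_m   # blade tip tangential speed, m/s
    rev_period = 60.0 / rpm            # one revolution, s
    # Expected (vibration-free) arrival instants for evenly spaced blades.
    expected = [k * rev_period / n_blades for k in range(n_blades)]
    return [(t - te) * tip_speed for t, te in zip(arrival_times_s, expected)]

# Hypothetical case: 4 blades at 3000 rpm, 0.3 m tip radius;
# blade 2 arrives 5 microseconds late (i.e. it is deflected backwards).
rpm, n, radius = 3000.0, 4, 0.3
times = [0.0, 0.005, 0.010 + 5e-6, 0.015]
defl = tip_deflections(times, n, rpm, radius)
print(defl)  # blade 2 shows a deflection of roughly 0.47 mm
```

Real BTT processing must additionally handle multiple probes, synchronous vs. asynchronous vibration, and sensor-specific trigger detection, which is where the methods of different vendors diverge.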
Improving HIV outcomes in resource-limited countries: the importance of quality indicators.
Ahonkhai, Aima A; Bassett, Ingrid V; Ferris, Timothy G; Freedberg, Kenneth A
2012-11-24
Resource-limited countries increasingly depend on quality indicators to improve outcomes within HIV treatment programs, but indicators of program performance suitable for use at the local program level remain underdeveloped. Using the existing literature as a guide, we applied standard quality improvement (QI) concepts to the continuum of HIV care from HIV diagnosis, to enrollment and retention in care, and highlighted critical service delivery process steps to identify opportunities for performance indicator development. We then identified existing indicators to measure program performance, citing examples used by pivotal donor agencies, and assessed their feasibility for use in surveying local program performance. Clinical delivery steps without existing performance measures were identified as opportunities for measure development. Using National Quality Forum (NQF) criteria as a guide, we developed measurement concepts suitable for use at the local program level that address existing gaps in program performance assessment. This analysis of the HIV continuum of care identified seven critical process steps providing numerous opportunities for performance measurement. Analysis of care delivery process steps and the application of NQF criteria identified 24 new measure concepts that are potentially useful for improving operational performance in HIV care at the local level. An evidence-based set of program-level quality indicators is critical for the improvement of HIV care in resource-limited settings. These performance indicators should be utilized as treatment programs continue to grow.
Development and Validation of Instruments to Measure Learning of Expert-Like Thinking
ERIC Educational Resources Information Center
Adams, Wendy K.; Wieman, Carl E.
2011-01-01
This paper describes the process for creating and validating an assessment test that measures the effectiveness of instruction by probing how well that instruction causes students in a class to think like experts about specific areas of science. The design principles and process are laid out and it is shown how these align with professional…
Bonow, Robert O; Douglas, Pamela S; Buxton, Alfred E; Cohen, David J; Curtis, Jeptha P; Delong, Elizabeth; Drozda, Joseph P; Ferguson, T Bruce; Heidenreich, Paul A; Hendel, Robert C; Masoudi, Frederick A; Peterson, Eric D; Taylor, Allen J
2011-09-27
Consistent with the growing national focus on healthcare quality, the American College of Cardiology Foundation (ACCF) and the American Heart Association (AHA) have taken a leadership role over the past decade in developing measures of the quality of cardiovascular care by convening a joint ACCF/AHA Task Force on Performance Measures. The Task Force is charged with identifying the clinical topics appropriate for the development of performance measures and with assembling writing committees composed of clinical and methodological experts in collaboration with appropriate subspecialty societies. The Task Force has also created methodology documents that offer guidance in the development of process, outcome, composite, and efficiency measures. Cardiovascular performance measures using existing ACCF/AHA methodology are based on Class I or Class III guidelines recommendations, usually with Level A evidence. These performance measures, based on evidence-based ACCF/AHA guidelines, remain the most rigorous quality measures for both internal quality improvement and public reporting. However, many of the tools for diagnosis and treatment of cardiovascular disease involve advanced technologies, such as cardiac imaging, for which there are often no underlying guideline documents. Because these technologies affect the quality of cardiovascular care and also have the potential to contribute to cardiovascular health expenditures, there is a need for more critical assessment of the use of technology, including the development of quality and performance measures in areas in which guideline recommendations are absent. The evaluation of quality in the use of cardiovascular technologies requires consideration of multiple parameters that differ from other healthcare processes. The present document describes methodology for development of 2 new classes of quality measures in these situations, appropriate use measures and structure/safety measures. 
Appropriate use measures are based on specific indications, processes, or parameters of care for which high-level evidence and Class I or Class III guideline recommendations may be lacking but which are addressed in ACCF appropriate use criteria documents. Structure/safety measures address structural aspects of the use of healthcare technology (e.g., laboratory accreditation, personnel training, and credentialing) or quality issues related to patient safety when there are neither guideline recommendations nor appropriate use criteria. Although the strength of evidence for appropriate use measures and structure/safety measures may not be as strong as that for formal performance measures, they are quality measures that are otherwise rigorously developed, reviewed, tested, and approved in the same manner as ACCF/AHA performance measures. The ultimate goal of the present document is to provide direction in defining and measuring appropriate use (avoiding not only underuse but also overuse and misuse) and the proper application of cardiovascular technology, and to describe how such appropriate use measures and structure/safety measures might be developed for the purposes of quality improvement and public reporting. It is anticipated that this effort will help shift the national dialogue on the use of cardiovascular technology away from the current concerns about volume and cost alone toward a more holistic emphasis on value.
Travers, Brittany G.; Bigler, Erin D.; Tromp, Do P. M.; Adluru, Nagesh; Froehlich, Alyson L.; Ennis, Chad; Lange, Nicholas; Nielsen, Jared A.; Prigge, Molly B. D.; Alexander, Andrew L.; Lainhart, Janet E.
2014-01-01
The present study used an accelerated longitudinal design to examine group differences and age-related changes in processing speed in 81 individuals with Autism Spectrum Disorder (ASD) compared to 56 age-matched individuals with typical development (ages 6–39 years). Processing speed was assessed using the Wechsler Intelligence Scale for Children-3rd edition (WISC-III) and the Wechsler Adult Intelligence Scale-3rd edition (WAIS-III). Follow-up analyses examined processing speed subtest performance and relations between processing speed and white matter microstructure (as measured with diffusion tensor imaging [DTI] in a subset of these participants). After controlling for full scale IQ, the present results show that processing speed index standard scores were on average 12 points lower in the group with ASD compared to the group with typical development. There were, however, no significant group differences in standard score age-related changes within this age range. For subtest raw scores, the group with ASD demonstrated robustly slower processing speeds in the adult versions of the IQ test (i.e., WAIS-III) but not in the child versions (WISC-III), even though age-related changes were similar in both the ASD and typically developing groups. This pattern of results may reflect difficulties that become increasingly evident in ASD on more complex measures of processing speed. Finally, DTI measures of whole-brain white matter microstructure suggested that fractional anisotropy (but not mean diffusivity, radial diffusivity, or axial diffusivity) made significant but small-sized contributions to processing speed standard scores across our entire sample. Taken together, the present findings suggest that robust decreases in processing speed may be present in ASD, more pronounced in adulthood, and partially attributable to white matter microstructural integrity. PMID:24269298
The Development of NOAA Education Common Outcome Performance Measures (Invited)
NASA Astrophysics Data System (ADS)
Baek, J.
2013-12-01
The National Oceanic and Atmospheric Administration (NOAA) Education Council has embarked on an ambitious Monitoring and Evaluation (M&E) project that will allow it to assess education program outcomes and impacts across the agency, line offices, and programs. The purpose of this internal effort is to link outcome measures to program efforts and to evaluate the success of the agency's education programs in meeting the strategic goals. Using an outcome-based evaluation approach, the NOAA Education Council is developing two sets of common outcome performance measures, environmental stewardship and professional development. This presentation will examine the benefits and tradeoffs of common outcome performance measures that collect program results across a portfolio of education programs focused on common outcomes. Common outcome performance measures have a few benefits to our agency and to the climate education field at large. The primary benefit is shared understanding, which comes from our process for writing common outcome performance measures. Without a shared and agreed upon set of definitions for the measure of an outcome, the reported results may not be measuring the same things and would incorrectly indicate levels of performance. Therefore, our writing process relies on a commitment to developing a shared set of definitions based on consensus. We hope that by taking the time to debate and coming to agreement across a diverse set of programs, the strength of our common measures can indicate real progress towards outcomes we care about. An additional benefit is that these common measures can be adopted and adapted by other agencies and organizations that share similar theories of change. The measures are not without their drawbacks, and we do make tradeoffs as part of our process in order to continue making progress. We know that any measure is necessarily a narrow slice of performance. 
Such a slice may not best represent the unique and remarkable contribution of an individual program, but it does reflect a variety of contributions along a single dimension across a large portfolio of programs. The process has pushed our working group to call for even more measures, to capture an increasing number of dimensions that reflect the nature of the portfolio of programs. This past year we have been developing two sets of common outcome performance measures, for professional development (PD) and stewardship education programs. The outcome we chose for PD programs was the use of what was learned in the educator's practice. The outcome we chose for stewardship programs was the stewardship behaviors that participants learn and practice. Measuring these outcomes will inform whether our strategies are having their intended impact. By knowing how, and how much, these outcomes occur as a result of our programs, we can improve program performance over time. The common outcome performance measures help demonstrate how these programs engage audiences in supporting NOAA's mission. As the AGU climate literacy community continues to grow, it is important to consider an approach to demonstrating the community's contribution to the Nation's climate literacy. Developing common outcome performance measures is one approach that could help focus the community on meeting its goals.
Science--A Process Approach, Product Development Report No. 8.
ERIC Educational Resources Information Center
Sanderson, Barbara A.; Kratochvil, Daniel W.
Science - A Process Approach, a science program for grades kindergarten through sixth, mainly focuses on scientific processes: observing, classifying, using numbers, measuring, space/time relationships, communicating, predicting, inferring, defining operationally, formulating hypotheses, interpreting data, controlling variables, and experimenting.…
ERIC Educational Resources Information Center
Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald
2011-01-01
This technical report describes the process of development and piloting of reading comprehension measures that are appropriate for seventh-grade students as part of an online progress screening and monitoring assessment system, http://easycbm.com. Each measure consists of an original fictional story of approximately 1,600 to 1,900 words with 20…
Fundamental movement skills testing in children with cerebral palsy.
Capio, Catherine M; Sit, Cindy H P; Abernethy, Bruce
2011-01-01
To examine the inter-rater reliability and comparative validity of product-oriented and process-oriented measures of fundamental movement skills among children with cerebral palsy (CP). In total, 30 children with CP aged 6 to 14 years (Mean = 9.83, SD = 2.5) and classified in Gross Motor Function Classification System (GMFCS) levels I-III performed tasks of catching, throwing, kicking, horizontal jumping and running. Process-oriented assessment was undertaken using a number of components of the Test of Gross Motor Development (TGMD-2), while product-oriented assessment included measures of time taken, distance covered and number of successful task completions. Cohen's kappa, Spearman's rank correlation coefficient and tests to compare correlated correlation coefficients were performed. Very good inter-rater reliability was found. Process-oriented measures for running and jumping had significant associations with GMFCS, as did seven product-oriented measures for catching, throwing, kicking, running and jumping. Product-oriented measures of catching, kicking and running had stronger associations with GMFCS than the corresponding process-oriented measures. Findings support the validity of process-oriented measures for running and jumping and of product-oriented measures of catching, throwing, kicking, running and jumping. However, product-oriented measures for catching, kicking and running appear to have stronger associations with functional abilities of children with CP, and are thus recommended for use in rehabilitation processes.
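Inter-rater reliability of the kind reported in this study is commonly quantified with Cohen's kappa, which corrects raw agreement for agreement expected by chance. The sketch below uses hypothetical binary component scores, not the study's data.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical scores on the same trials."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    cats = sorted(set(rater_a) | set(rater_b))
    # Observed proportion of agreement.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies.
    pe = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# Hypothetical 0/1 skill-component scores from two raters on 10 trials:
a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
b = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
print(round(cohens_kappa(a, b), 3))  # ~0.783: "very good" by common benchmarks
```

Note that kappa is undefined when chance agreement is 1 (both raters give a single category throughout); a production implementation would guard against that case.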
High-pressure spectroscopic measurement on diffusion with a diamond-anvil cell
NASA Astrophysics Data System (ADS)
Aoki, K.; Katoh, Eriko; Yamawaki, H.; Fujihisa, H.; Sakashita, M.
2003-04-01
We report a diamond-anvil-cell (DAC) technique developed for spectroscopic measurement of the diffusion process in molecular solids at high pressure. The diffusion of atoms, molecules, or their ionic species is investigated in a bilayer specimen by measuring the variation of infrared vibrational spectra with time. The experimental procedures for protonic and molecular diffusion measurements on ice at 400 K and 10.2 GPa are presented as an example study. The in situ spectroscopic technique with a DAC significantly extends the pressure range accessible for diffusion measurement. Diffusion at rates of 10^-16 to 10^-14 m^2/s can currently be observed at temperatures of 300-600 K and pressures up to several tens of gigapascals.
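Why rates of 10^-16 to 10^-14 m^2/s are observable in situ follows from the one-dimensional diffusion timescale t ~ L^2 / (2D): for a micron-scale bilayer, the spectra evolve over seconds to hours, well matched to laboratory observation. A back-of-the-envelope sketch (illustrative numbers only, not the paper's analysis):

```python
import math

def diffusion_time(layer_thickness_m, D_m2_s):
    """Characteristic 1-D diffusion time across a layer: t ~ L^2 / (2 D)."""
    return layer_thickness_m ** 2 / (2.0 * D_m2_s)

def penetration_depth(D_m2_s, t_s):
    """Mean 1-D diffusion length after time t: L ~ sqrt(2 D t)."""
    return math.sqrt(2.0 * D_m2_s * t_s)

# For a 1-micron bilayer and D in the reported 1e-16..1e-14 m^2/s range,
# the mixing time runs from tens of seconds to a few hours:
for D in (1e-16, 1e-15, 1e-14):
    print(f"D = {D:.0e} m^2/s  ->  t ~ {diffusion_time(1e-6, D):.0f} s")
```

The same scaling shows why much slower diffusion would require impractically long runs or thinner specimen layers.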
Development of Nanotechnology for X-Ray Astronomy Instrumentation
NASA Technical Reports Server (NTRS)
Schattenburg, Mark L.
2004-01-01
This Research Grant provides support for development of nanotechnology for x-ray astronomy instrumentation. MIT has made significant progress in several development areas. In the last year we have made considerable progress in demonstrating the high-fidelity patterning and replication of x-ray reflection gratings. We developed a process for fabricating blazed gratings in silicon with extremely smooth and sharp sawtooth profiles, and developed a nanoimprint process for replication. We also developed sophisticated new fixturing for holding thin optics during metrology without causing distortion. We developed a new image processing algorithm for our Shack-Hartmann tool that uses Zernike polynomials. This has resulted in much more accurate and repeatable measurements on thin optics.
Determinants of job stress in chemical process industry: A factor analysis approach.
Menon, Balagopal G; Praveensal, C J; Madhu, G
2015-01-01
Job stress is one of the active research domains in industrial safety research. Job stress can result in accidents and health-related issues among workers in chemical process industries. Hence it is important to measure the level of job stress in workers so that it can be mitigated before it leads to safety problems in these industries. The objective of this study is to determine the job stress factors in the chemical process industry in Kerala state, India. This study also aims to propose a comprehensive model and an instrument framework for measuring job stress levels in the chemical process industries in Kerala, India. The data were collected through a questionnaire survey conducted in chemical process industries in Kerala. The data from 1197 completed surveys were subjected to principal component and confirmatory factor analysis to develop the job stress factor structure. The factor analysis revealed 8 factors that influence job stress in process industries. It was also found that job stress in employees is most influenced by role ambiguity and least by the work environment. The study developed an instrument framework for measuring job stress using exploratory factor analysis and structural equation modeling.
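The principal-component step of such a factor analysis can be sketched as follows: standardize the survey items, eigendecompose their correlation matrix, and retain factors by the common Kaiser (eigenvalue > 1) criterion. This is a generic illustration on synthetic data, not the study's analysis; the item structure and seed are hypothetical.

```python
import numpy as np

def extract_factors(data, kaiser=1.0):
    """Principal-component factor extraction on survey items.

    Returns the eigenvalues of the item correlation matrix (descending)
    and the number of factors retained under the Kaiser criterion.
    """
    corr = np.corrcoef(data, rowvar=False)          # item correlation matrix
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
    return eigvals, int(np.sum(eigvals > kaiser))

# Synthetic responses: 200 workers x 6 items; items 0-2 and items 3-5 each
# load on one latent factor, so roughly two factors should be retained.
rng = np.random.default_rng(0)
f1 = rng.normal(size=(200, 1))
f2 = rng.normal(size=(200, 1))
items = np.hstack([f1 + 0.5 * rng.normal(size=(200, 3)),
                   f2 + 0.5 * rng.normal(size=(200, 3))])
eigvals, n_factors = extract_factors(items)
print(n_factors)  # typically 2 for this construction
```

A confirmatory factor analysis or structural equation model, as used in the study, would then test whether this extracted structure fits held-out data.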
ATM Coastal Topography - Louisiana, 2001: UTM Zone 16 (Part 2 of 2)
Yates, Xan; Nayegandhi, Amar; Brock, John C.; Sallenger, Asbury H.; Klipp, Emily S.; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of lidar-derived first-surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Louisiana coastline beach face within UTM Zone 16, from Grand Isle to the Chandeleur Islands, acquired September 7 and 9, 2001. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative scanning lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. 
Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. ALPS is used routinely to create maps that represent submerged or first-surface topography.
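The first/last significant return extraction mentioned above can be sketched as a simple thresholding pass; the threshold value, sample interval, and function names below are illustrative assumptions, not the actual ALPS algorithm.

```python
def significant_returns(waveform, threshold):
    """Return (first, last) indices of samples exceeding an amplitude
    threshold, i.e. the first and last significant returns in a lidar
    waveform. Returns None if no sample exceeds the threshold."""
    above = [i for i, amp in enumerate(waveform) if amp >= threshold]
    if not above:
        return None
    return above[0], above[-1]

def sample_to_range(index, sample_interval_ns=1.0, c=0.299792458):
    """Convert a sample index to one-way range in meters.
    c is the speed of light in m/ns; the factor of 2 accounts for
    the round trip of the laser pulse."""
    return index * sample_interval_ns * c / 2.0

# A toy waveform: noise floor around 2, returns near samples 5 and 12.
waveform = [2, 1, 2, 3, 2, 40, 25, 9, 3, 2, 4, 18, 30, 6, 2, 1]
first, last = significant_returns(waveform, threshold=10)
```

In a bathymetric survey the first return would typically correspond to the water surface and the last to the submerged topography, which is why both are extracted from each waveform.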
ATM Coastal Topography-Louisiana, 2001: UTM Zone 15 (Part 1 of 2)
Yates, Xan; Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Klipp, Emily S.; Wright, C. Wayne
2010-01-01
These remotely sensed, geographically referenced elevation measurements of lidar-derived first-surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Louisiana coastline beach face within UTM Zone 15, from Isles Dernieres to Grand Isle, acquired September 7 and 10, 2001. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative scanning lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. 
Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. ALPS is used routinely to create maps that represent submerged or first-surface topography.
ATM Coastal Topography-Texas, 2001: UTM Zone 14
Klipp, Emily S.; Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Bonisteel, Jamie M.; Yates, Xan; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of lidar-derived first-surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Texas coastline within UTM zone 14, acquired October 12-13, 2001. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative scanning lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. 
ALPS is used routinely to create maps that represent submerged or first-surface topography.
ATM Coastal Topography-Texas, 2001: UTM Zone 15
Klipp, Emily S.; Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Bonisteel, Jamie M.; Yates, Xan; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of lidar-derived first-surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Texas coastline within UTM zone 15, from Matagorda Peninsula to Galveston Island, acquired October 12-13, 2001. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative scanning lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. 
Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. ALPS is used routinely to create maps that represent submerged or first-surface topography.
ATM Coastal Topography-Florida 2001: Western Panhandle
Yates, Xan; Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Bonisteel, Jamie M.; Klipp, Emily S.; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the western Florida panhandle coastline, acquired October 2-4 and 7-10, 2001. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative scanning Lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning Lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of Lidar data in an interactive or batch mode. Modules for presurvey flight line definition, flight path plotting, Lidar raster and waveform investigation, and digital camera image playback have been developed. Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. 
ALPS is routinely used to create maps that represent submerged or first surface topography.
ATM Coastal Topography-Mississippi, 2001
Nayegandhi, Amar; Yates, Xan; Brock, John C.; Sallenger, A.H.; Klipp, Emily S.; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of lidar-derived first-surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the Mississippi coastline, from Lakeshore to Petit Bois Island, acquired September 9-10, 2001. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative scanning lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. 
ALPS is used routinely to create maps that represent submerged or first-surface topography.
Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel
2013-01-01
Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent accounts in social cognition suggest a dual-process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology, the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in the respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures further supported the tasks' external validity. Between-group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between-group differences in the implicit task was found for a speed-accuracy composite measure. 
The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive interventions. PMID:23805122
NASA Astrophysics Data System (ADS)
Roellig, Mike; Meier, Karsten; Metasch, Rene
2010-11-01
The recent development of 3D-integrated electronic packages is characterized by the need to increase the diversity of functions and to miniaturize. Currently many 3D-integration concepts are being developed, and all of them demand new materials, new designs, and new processing technologies. The combination of simulation and experimental investigation is increasingly accepted, since simulations help to shorten the R&D cycle time and reduce costs. Numerical methods like the Finite-Element-Method are strong tools to calculate stress conditions in electronic packages resulting from thermal strains due to the manufacturing process and environmental loads. It is essential for the application of numerical calculations that the material data be accurate and describe the physical behaviour sufficiently. The developed machine allows the measurement of time- and temperature-dependent micromechanical properties of solder joints. Solder joints, which are used to mechanically and electrically connect different packages, are measured physically as they leave the process. This allows accounting for process influences, which may change material properties. Additionally, joint sizes and metallurgical interactions between solder and under-bump metallization can be respected by this particular measurement. The measurement allows the determination of material properties within a temperature range of 20 °C to 200 °C. Further, the time-dependent creep deformation can be measured within a strain-rate range of 10^-3 1/s to 10^-8 1/s. Solder alloys based on Sn-Ag/Sn-Ag-Cu with additional impurities and joint sizes down to Ø 200 μm were investigated. To complete the material characterization process, the material model coefficients were extracted by FEM simulation to increase the accuracy of the data.
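Creep measurements of the sort described above are, in the solder-mechanics literature, commonly fitted with a Garofalo (hyperbolic-sine) law; the form below is that standard expression, shown for orientation rather than taken from this abstract:

```latex
\dot{\varepsilon} = A \,\bigl[\sinh(\alpha \sigma)\bigr]^{n} \exp\!\left(-\frac{Q}{RT}\right)
```

where σ is the applied stress, T the absolute temperature, Q an activation energy, R the gas constant, and A, α, n are fitted coefficients of the kind extracted by FEM simulation in the characterization step.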
Nuclear data for r-process models from ion trap measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Jason, E-mail: jclark@anl.gov
2016-06-21
To truly understand how elements are created in the universe via the astrophysical r process, accurate nuclear data are required. Historically, the isotopes involved in the r process have been difficult to access for study, but the development of new facilities and measurement techniques has put many of the r-process isotopes within reach. This paper will discuss the new CARIBU facility at Argonne National Laboratory and two pieces of experimental equipment, the Beta-decay Paul Trap and the Canadian Penning Trap, that will dramatically increase the nuclear data available for models of the astrophysical r process.
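For orientation, Penning-trap mass measurements of the sort performed with the Canadian Penning Trap rest on the standard cyclotron relation (textbook background, not a detail from this abstract):

```latex
\nu_c = \frac{qB}{2\pi m}
```

An ion of charge q and mass m in a magnetic field B orbits at cyclotron frequency ν_c, so measuring ν_c against a reference ion of well-known mass yields the atomic masses, and hence decay Q-values, that r-process models require.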
Analytical method for promoting process capability of shock absorption steel.
Sung, Wen-Pei; Shih, Ming-Hsiang; Chen, Kuen-Suan
2003-01-01
Mechanical properties and low-cycle fatigue are two factors that must be considered in developing new steels for shock absorption. Process capability and process control are significant factors in achieving the purpose of research and development programs. Often-used evaluation methods fail to measure process yield and process centering, so this paper uses the Taguchi loss function as the basis for an evaluation method, with steps for assessing the quality of mechanical properties and the process control of an iron and steel manufacturer. This method can serve the research and development and manufacturing industries and lay a foundation for enhancing process control capability, enabling the selection of manufacturing processes that are more reliable than those chosen by other commonly used decision-making methods.
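The Taguchi loss function underlying such an evaluation method can be sketched as follows; the loss coefficient k, the target value, and the sample data are illustrative assumptions, not figures from the paper.

```python
from statistics import fmean, pvariance

def expected_taguchi_loss(samples, target, k=1.0):
    """Expected quadratic (Taguchi) loss E[L] = k * (sigma^2 + (mu - target)^2).
    Unlike yield-only capability measures, this penalizes both spread
    around the mean and deviation of the mean from the target, i.e.
    poor process centering."""
    mu = fmean(samples)
    var = pvariance(samples, mu)
    return k * (var + (mu - target) ** 2)

# Two processes with identical spread; B is off-center, so its loss is higher.
process_a = [9.9, 10.0, 10.1, 10.0, 10.0]
process_b = [10.4, 10.5, 10.6, 10.5, 10.5]
loss_a = expected_taguchi_loss(process_a, target=10.0)
loss_b = expected_taguchi_loss(process_b, target=10.0)
```

A yield-based index would rate both processes alike whenever both fall inside specification limits; the loss-based view is what lets the method distinguish them.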
How Evaluation Processes Affect the Professional Development of Five Teachers in Higher Education
ERIC Educational Resources Information Center
Shagrir, Leah
2012-01-01
This paper presents research that investigates the nature of the connection between the professional development of five teachers in higher education and the evaluation processes they have to undergo. Since teaching, scholarship, and service are the three components that evaluation measures, this research examines how the teachers' professional…
Idea-Based Learning: A Course Design Process to Promote Conceptual Understanding
ERIC Educational Resources Information Center
Hansen, Edmund J.
2011-01-01
Synthesizing the best current thinking about learning, course design, and promoting student achievement, this is a guide to developing college instruction that has clear purpose, is well integrated into the curriculum, and improves student learning in predictable and measurable ways. The process involves developing a transparent course blueprint,…
Gender Matters: Exploring the Process of Developing Resilience through Outdoor Adventure
ERIC Educational Resources Information Center
Overholt, Jillisa R.; Ewert, Alan
2015-01-01
This two-phase study investigates the process of developing resilience through participation in outdoor adventure programming. In this study, resilience is conceptualized as experiencing growth through a disruptive event. In the first phase, a pre-post survey measure was used to assess resilience in university students who were enrolled in a…
ERIC Educational Resources Information Center
Meng, Xiangzhi; Sai, Xiaoguang; Wang, Cixin; Wang, Jue; Sha, Shuying; Zhou, Xiaolin
2005-01-01
By measuring behavioural performance and event-related potentials (ERPs) this study investigated the extent to which Chinese school children's reading development is influenced by their skills in auditory, speech, and temporal processing. In Experiment 1, 102 normal school children's performance in pure tone temporal order judgment, tone frequency…
A sub-process view of working memory capacity: evidence from effects of speech on prose memory.
Sörqvist, Patrik; Ljungberg, Jessica K; Ljung, Robert
2010-04-01
In this article we outline a "sub-process view" of working memory capacity (WMC). This view suggests that any relationship between WMC and another construct (e.g., reading comprehension) is actually a relationship with a specific part of the WMC construct. The parts, called sub-processes, are functionally distinct and can be measured by intrusion errors in WMC tasks. Since the sub-processes are functionally distinct, some sub-process may be related to a certain phenomenon, whereas another sub-process is related to other phenomena. In two experiments we show that a sub-process (measured by immediate/current-list intrusions) is related to the effects of speech on prose memory (semantic auditory distraction), whereas another sub-process (measured by delayed/prior-list intrusions), known for its contribution to reading comprehension, is not. In Experiment 2 we developed a new WMC task called "size-comparison span" and found that the relationship between WMC and semantic auditory distraction is actually a relationship with a sub-process measured by current-list intrusions in our new task.
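The intrusion-error scoring that distinguishes the sub-processes can be sketched as follows; the data structures and function name are hypothetical, chosen only to illustrate the current-list versus prior-list distinction, not the authors' actual coding scheme.

```python
def score_intrusions(recalled, current_targets, current_list, prior_lists):
    """Classify recall errors in a working memory capacity (WMC) span task.

    Current-list intrusions: recalled items that appeared in the current
    list but were not to-be-remembered targets. Prior-list intrusions:
    recalled items that appeared only in earlier lists. On the sub-process
    view, the two error types index functionally distinct sub-processes.
    """
    prior_items = {item for lst in prior_lists for item in lst}
    current_intrusions = 0
    prior_intrusions = 0
    for item in recalled:
        if item in current_targets:
            continue  # correct recall, not an error
        if item in current_list:
            current_intrusions += 1
        elif item in prior_items:
            prior_intrusions += 1
    return current_intrusions, prior_intrusions

# Targets were "cat" and "sun"; "dog" was a current-list distractor,
# and "oak" appeared only in an earlier list.
cur, pri = score_intrusions(
    recalled=["cat", "dog", "oak"],
    current_targets={"cat", "sun"},
    current_list={"cat", "sun", "dog"},
    prior_lists=[["oak", "ice"]],
)
```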
Thermal sensors to control polymer forming. Challenge and solutions
NASA Astrophysics Data System (ADS)
Lemeunier, F.; Boyard, N.; Sarda, A.; Plot, C.; Lefèvre, N.; Petit, I.; Colomines, G.; Allanic, N.; Bailleul, J. L.
2017-10-01
Thermal sensors have been used for many years to better understand and control material forming processes, especially polymer processing. Due to technical constraints (high pressure, sealing, sensor dimensions…), the thermal measurement is often performed in the tool or close to its surface; thus, it gives only partial and disturbed information. Reliable information about the heat flux exchanged between the tool and the material during the process would be very helpful to improve control of the process and to favor the development of new materials. In this work, we present several sensors developed in labs to study the molding steps in forming processes. The analysis of the obtained thermal measurements (temperature, heat flux) shows the sensitivity threshold that thermal sensors require to detect the rate of thermal reaction on-line. Based on these data, we present new sensor designs, which have been patented.
Measuring and assessing maintainability at the end of high level design
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.
1993-01-01
Software architecture appears to be one of the main factors affecting software maintainability. Therefore, in order to predict and assess maintainability early in the development process, we need to be able to measure the high-level design characteristics that affect the change process. To this end, we propose a measurement approach based on precise assumptions derived from the change process; it builds on object-oriented design principles and is partially language independent. We define metrics for cohesion, coupling, and visibility in order to capture the difficulty of isolating, understanding, designing, and validating changes.
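A coupling count of the general kind proposed can be sketched as follows; the dependency representation (each module mapped to the modules it references) and the module names are illustrative assumptions, not the metric definitions from the paper.

```python
def coupling(dependencies):
    """Fan-out coupling: for each module, the number of distinct other
    modules it references, given a mapping from module name to the set
    of modules it uses. Self-references are ignored. Higher values
    suggest that changes are harder to isolate and validate."""
    return {
        module: len(used - {module})
        for module, used in dependencies.items()
    }

# A toy high-level design with three modules.
design = {
    "Parser": {"Lexer", "ErrorReporter"},
    "Lexer": {"ErrorReporter"},
    "ErrorReporter": set(),
}
fan_out = coupling(design)
```

Measures like this can be computed from design documents alone, which is what makes assessment possible before any implementation exists.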
Neuderth, S; Lukasczik, M; Musekamp, G; Gerlich, C; Saupe-Heide, M; Löbmann, R; Vogel, H
2013-02-01
So far there is no standardized program for external quality assurance in inpatient parent-child prevention and rehabilitation in Germany. Therefore, instruments and methods of external quality assurance were developed and evaluated on behalf of the federal-level health insurance institutions. On the level of structure quality, a modular questionnaire for assessing structural features of rehabilitation/prevention centers, basic and allocation criteria, as well as a checklist for visitations were developed. Structural data were collected in a nationwide survey of parent-child prevention and rehabilitation centers. Process and outcome quality data were collected in n=38 centers. Process quality was assessed using multiple methods (process-related structural features, case-related routine documentation, and incident-related patient questionnaires). Outcome quality was measured via patient questionnaires (n=1,799 patients). We used a multi-level modelling approach, adjusting for relevant confounders at the institutional and patient levels. The methods, instruments, and analysis procedures developed for measuring quality on the level of structure, processes, and outcomes were adjusted in cooperation with all relevant stakeholders. Results are presented exemplarily for all quality assurance tools. For most of the risk-adjusted outcome parameters, we found no significant differences between institutions. For the first time, a comprehensive, standardized, and generally applicable set of methods and instruments for routine use in comparative quality measurement of inpatient parent-child prevention and rehabilitation is available. However, it should be considered that the very heterogeneous field of family-oriented measures cannot be covered entirely by an external quality assurance program. Therefore, methods and instruments have to be adapted continuously to the specifics of this area of health care and to new developments. © Georg Thieme Verlag KG Stuttgart · New York.
John, Mary; Jeffries, Fiona W; Acuna-Rivera, Marcela; Warren, Fiona; Simonds, Laura M
2015-01-01
Recovery has become a central concept in mental health service delivery, and several recovery-focused measures exist for adults. The concept's applicability to young people's mental health experience has been neglected, and no measures yet exist. The aim of this work is to develop measures of recovery for use in specialist child and adolescent mental health services. On the basis of 21 semi-structured interviews, three recovery measures were devised, one for completion by the young person and two for completion by the parent/carer. Two parent/carer measures were devised in order to assess both their perspective on their child's recovery and their own recovery process. The questionnaires were administered to a UK sample of 47 young people (10-18 years old) with anxiety and depression and their parents, along with a measure used to routinely assess treatment progress and outcome and a measure of self-esteem. All three measures had high internal consistency (alpha ≥ 0.89). Young people's recovery scores correlated negatively with scores on a measure used to routinely assess treatment progress and outcome (r = -0.75) and positively with self-esteem (r = 0.84). Parent and young persons' reports of the young person's recovery were positively correlated (r = 0.61). Parent reports of the young person's recovery and of their own recovery process were positively correlated (r = 0.75). The three measures have the potential to be used in mental health services to assess recovery processes in young people with mental health difficulties and their correspondence with symptomatic improvement. The measures provide a novel way of capturing the parental/caregiver perspective on recovery and caregivers' own wellbeing. No tools exist to evaluate recovery-relevant processes in young people treated in specialist mental health services. 
This study reports on the development and psychometric evaluation of three self-report recovery-relevant assessments for young people and their caregivers. Findings indicate a high degree of correspondence between young person and caregiver reports of recovery in the former. The recovery assessments correlate inversely with a standardized symptom-focused measure and positively with self-esteem. Copyright © 2014 John Wiley & Sons, Ltd.
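The internal-consistency figure reported (alpha ≥ 0.89) is Cronbach's alpha; a minimal stdlib computation is sketched below with made-up item scores (the data are illustrative, not from the study).

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of per-item score columns, where each
    column holds one item's scores across all respondents:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores).
    Values near 1 indicate that items measure a common construct."""
    k = len(item_scores)
    n = len(item_scores[0])
    totals = [sum(col[i] for col in item_scores) for i in range(n)]
    item_var = sum(pvariance(col) for col in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Three questionnaire items answered by four respondents.
items = [
    [3, 4, 2, 5],
    [3, 5, 2, 4],
    [2, 4, 3, 5],
]
alpha = cronbach_alpha(items)
```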
VARTM Process Modeling of Aerospace Composite Structures
NASA Technical Reports Server (NTRS)
Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.
2003-01-01
A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than measured, most likely due to the inaccurate preform permeability values used in the simulation.
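For orientation, resin flow through a fibrous preform in VARTM models of this kind is commonly described by Darcy's law (a standard assumption in the liquid-molding literature, not a detail taken from this abstract):

```latex
\mathbf{v} = -\,\frac{\mathbf{K}}{\mu}\,\nabla p
```

where v is the volume-averaged resin velocity, K the preform permeability tensor, μ the resin viscosity, and p the pressure. Because fill velocity scales directly with K, an underestimated permeability lengthens predicted infiltration times, consistent with the discrepancy reported above.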
Free energy surfaces from nonequilibrium processes without work measurement
NASA Astrophysics Data System (ADS)
Adib, Artur B.
2006-04-01
Recent developments in statistical mechanics have allowed the estimation of equilibrium free energies from the statistics of work measurements during processes that drive the system out of equilibrium. Here a different class of processes is considered, wherein the system is prepared and released from a nonequilibrium state, and no external work is involved during its observation. For such "clamp-and-release" processes, a simple strategy for the estimation of equilibrium free energies is offered. The method is illustrated with numerical simulations and analyzed in the context of tethered single-molecule experiments.
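For context, the work-based estimators this abstract contrasts itself with rest on the Jarzynski equality (a standard result, shown here for orientation rather than taken from the abstract):

```latex
\Delta F = -\,\beta^{-1} \ln \left\langle e^{-\beta W} \right\rangle, \qquad \beta = \frac{1}{k_B T}
```

where W is the work measured in repeated realizations of the driving protocol and the average runs over those realizations. The clamp-and-release class of processes avoids measuring W altogether, since no external work is performed during observation.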
'Healthy Eating and Lifestyle in Pregnancy (HELP)' trial: Process evaluation framework.
Simpson, Sharon A; Cassidy, Dunla; John, Elinor
2014-07-01
We developed and tested in a cluster RCT a theory-driven, group-based intervention for obese pregnant women. It was designed to support women to moderate weight gain during pregnancy and to reduce BMI one year after birth, in addition to targeting secondary health and wellbeing outcomes. In line with MRC guidance on developing and evaluating complex interventions in health, we conducted a process evaluation alongside the trial. This paper describes the development of the process evaluation framework. This cluster RCT recruited 598 pregnant women. Women in the intervention group were invited to attend a weekly weight-management group. Following a review of relevant literature, we developed a process evaluation framework which outlined the key process indicators we wanted to address and how we would measure them. Central to the process evaluation was understanding the mechanism of effect of the intervention. We utilised a logic-modelling approach to describe the intervention, which helped us focus on which potential mediators of intervention effect to measure, and how. The resulting process evaluation framework was designed to address a set of core elements: context, reach, exposure, recruitment, fidelity, retention, contamination, and theory-testing. These were assessed using a variety of qualitative and quantitative approaches. The logic model explained the processes by which intervention components bring about change in target outcomes through various mediators and theoretical pathways, including self-efficacy, social support, self-regulation, and motivation. Process evaluation is a key element in assessing the effect of any RCT. We developed a process evaluation framework and logic model, and the results of analyses using these will offer insights into why the intervention is or is not effective. Copyright © 2014.
Automated data acquisition technology development: Automated modeling and control development
NASA Technical Reports Server (NTRS)
Romine, Peter L.
1995-01-01
This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rack-mounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this report, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage-versus-current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.
Idzerda, Leanne; Rader, Tamara; Tugwell, Peter; Boers, Maarten
2014-05-01
The usefulness of randomized controlled trials to advance clinical care depends upon the outcomes reported, but disagreement on the choice of outcome measures has resulted in inconsistency and the potential for reporting bias. One solution to this problem is the development of a core outcome set: a minimum set of outcome measures deemed critical for clinical decision making. Within rheumatology the Outcome Measures in Rheumatology (OMERACT) initiative has pioneered the development of core outcome sets since 1992. As the number of diseases addressed by OMERACT has increased and its experience in formulating core sets has grown, clarification and update of the conceptual framework and formulation of a more explicit process of area/domain core set development have become necessary. As part of the update process of the OMERACT Filter criteria to version 2, a literature review was undertaken to compare and contrast the OMERACT conceptual framework with others within and outside rheumatology. A scoping search was undertaken to examine the extent, range, and nature of conceptual frameworks for core set outcome selection in health. We searched the following resources: Cochrane Library Methods Group Register; Medline; Embase; PsycInfo; Environmental Studies and Policy Collection; and ABI/INFORM Global. We also conducted a targeted Google search. Five conceptual frameworks were identified: the WHO tripartite definition of health; the 5 Ds (discomfort, disability, drug toxicity, dollar cost, and death); the International Classification of Functioning (ICF); PROMIS (Patient-Reported Outcomes Measurement Information System); and the Outcomes Hierarchy. Of these, only the 5 Ds and ICF frameworks have been systematically applied in core set development. Outside the area of rheumatology, several core sets were identified; these had been developed through a limited range of consensus-based methods with varying degrees of methodological rigor.
None applied a framework to ensure content validity of the end product. This scoping review reinforced the need for clear methods and standards for core set development. Based on these findings, OMERACT will make its own conceptual framework and working process more explicit. Proposals for how to achieve this were discussed at the OMERACT 11 conference.
Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee
2003-01-01
Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
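The four costing steps above reduce to a simple calculation once the flowchart has yielded per-step resource estimates and unit values. A minimal sketch of process-based costing (the care-planning step names, quantities, and unit values below are invented for illustration, not taken from the study):

```python
# Process-based costing: flowchart steps -> resource use -> resource values -> direct cost.
# All step names and figures are hypothetical examples.

care_planning_steps = [
    # (flowchart step, resource, quantity, unit value in USD)
    ("chart review",    "RN hours",   0.50, 40.0),
    ("team conference", "RN hours",   0.75, 40.0),
    ("team conference", "aide hours", 0.75, 15.0),
    ("write care plan", "RN hours",   1.00, 40.0),
]

def direct_cost(steps):
    """Step 4: sum quantity x unit value over every resource in the flowchart."""
    return sum(qty * unit for _step, _res, qty, unit in steps)

total = direct_cost(care_planning_steps)  # 20 + 30 + 11.25 + 40 = 101.25
```

The same table can be re-costed after a process change by editing only the quantity column, which is what makes the strategy cheap to maintain.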
NASA Technical Reports Server (NTRS)
Poppendiek, H. F.; Sabin, C. M.; Meckel, P. T.
1974-01-01
The research is reported in applying the axial fluid temperature differential flowmeter to a urine volume measurement system for space missions. The fluid volume measurement system is described along with the prototype equipment package. Flowmeter calibration, electronic signal processing, and typical void volume measurements are also described.
Zhou, Haiying; Purdie, Jennifer; Wang, Tongtong; Ouyang, Anli
2010-01-01
The number of therapeutic proteins produced by cell culture in the pharmaceutical industry continues to increase. During the early stages of manufacturing process development, hundreds of clones and various cell culture conditions are evaluated to develop a robust process and to identify and select cell lines with high productivity. It is highly desirable to establish a high throughput system to accelerate process development and reduce cost. Multiwell plates and shake flasks are widely used in the industry as the scale down model for large-scale bioreactors. However, one of the limitations of these two systems is the inability to measure and control pH in a high throughput manner. As pH is an important process parameter for cell culture, this could limit the applications of these scale down model vessels. An economical, rapid, and robust pH measurement method was developed at Eli Lilly and Company by employing SNARF-4F 5-(and-6)-carboxylic acid. The method demonstrated the ability to measure the pH values of cell culture samples in a high throughput manner. Based upon the chemical equilibrium of CO2, HCO3-, and the buffer system, i.e., HEPES, we established a mathematical model to regulate pH in multiwell plates and shake flasks. The model calculates the required %CO2 from the incubator and the amount of sodium bicarbonate to be added to adjust pH to a preset value. The model was validated by experimental data, and pH was accurately regulated by this method. The feasibility of studying the pH effect on cell culture in 96-well plates and shake flasks was also demonstrated in this study. This work sheds light on mini-bioreactor scale down model construction and paves the way for cell culture process development to improve productivity or product quality using high throughput systems.
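The abstract does not reproduce the model's equations, but a CO2/bicarbonate regulation model of this kind typically rests on the Henderson-Hasselbalch relation. A hedged sketch of how the required incubator %CO2 could be computed for a target pH, assuming a plasma-like apparent pKa of 6.1 and CO2 solubility of 0.0307 mmol/L per mmHg at 37 °C (these constants, and the neglect of the HEPES contribution, are simplifying assumptions, not the paper's actual model):

```python
import math

PKA = 6.1        # apparent pKa of the CO2/HCO3- pair at 37 C (assumed)
S_CO2 = 0.0307   # CO2 solubility, mmol/L per mmHg (assumed plasma-like value)
P_ATM = 760.0    # ambient pressure, mmHg

def medium_ph(hco3_mmol_l, pct_co2):
    """Henderson-Hasselbalch: pH = pKa + log10([HCO3-] / (s * pCO2))."""
    p_co2 = P_ATM * pct_co2 / 100.0
    return PKA + math.log10(hco3_mmol_l / (S_CO2 * p_co2))

def required_pct_co2(hco3_mmol_l, target_ph):
    """Invert the relation to obtain the incubator %CO2 setpoint."""
    p_co2 = hco3_mmol_l / (S_CO2 * 10.0 ** (target_ph - PKA))
    return 100.0 * p_co2 / P_ATM

# e.g. 24 mmol/L bicarbonate under ~5.3% CO2 gives a pH near 7.4
```

In practice the bicarbonate addition and the %CO2 setpoint are solved together, since adding NaHCO3 changes [HCO3-] in the same equilibrium.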
A simplified conjoint recognition paradigm for the measurement of gist and verbatim memory.
Stahl, Christoph; Klauer, Karl Christoph
2008-05-01
The distinction between verbatim and gist memory traces has furthered the understanding of numerous phenomena in various fields, such as false memory research, research on reasoning and decision making, and cognitive development. To measure verbatim and gist memory empirically, an experimental paradigm and multinomial measurement model has been proposed but rarely applied. In the present article, a simplified conjoint recognition paradigm and multinomial model is introduced and validated as a measurement tool for the separate assessment of verbatim and gist memory processes. A Bayesian metacognitive framework is applied to validate guessing processes. Extensions of the model toward incorporating the processes of phantom recollection and erroneous recollection rejection are discussed.
Robotic tool positioning process using a multi-line off-axis laser triangulation sensor
NASA Astrophysics Data System (ADS)
Pinto, T. C.; Matos, G.
2018-03-01
Proper positioning of a friction stir welding head for pin insertion, driven by a closed chain robot, is important to ensure quality repair of cracks. A multi-line off-axis laser triangulation sensor was designed to be integrated to the robot, allowing relative measurements of the surface to be repaired. This work describes the sensor characteristics, its evaluation and the measurement process for tool positioning to a surface point of interest. The developed process uses a point of interest image and a measured point cloud to define the translation and rotation for tool positioning. Sensor evaluation and tests are described. Keywords: laser triangulation, 3D measurement, tool positioning, robotics.
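The abstract does not detail the pose computation, but positioning a tool normal to a measured surface at a point of interest can be sketched from the point-cloud data. A minimal pure-Python illustration that estimates the local surface normal from three non-collinear measured points (a real system would fit a plane to many points; all names here are hypothetical):

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def normalize(v):
    m = (v[0]**2 + v[1]**2 + v[2]**2) ** 0.5
    return (v[0]/m, v[1]/m, v[2]/m)

def surface_normal(p0, p1, p2):
    """Unit normal of the plane through three measured surface points."""
    return normalize(cross(sub(p1, p0), sub(p2, p0)))

def tool_translation(tool_tip, point_of_interest):
    """Translation that brings the tool tip onto the point of interest."""
    return sub(point_of_interest, tool_tip)

# A flat patch in the xy-plane yields the normal (0, 0, 1), which then
# defines the rotation needed to align the tool axis with the surface.
n = surface_normal((0, 0, 0), (1, 0, 0), (0, 1, 0))
```

The rotation aligning the tool axis with `n` would then be computed by the robot controller; that step is sensor-independent and is omitted here.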
Performance measurement integrated information framework in e-Manufacturing
NASA Astrophysics Data System (ADS)
Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José
2014-11-01
The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied to decision support in an e-Manufacturing environment. Its application improves the necessary interoperability in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and a PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.
NASA Astrophysics Data System (ADS)
Czarski, T.; Chernyshova, M.; Pozniak, K. T.; Kasprowicz, G.; Byszuk, A.; Juszczyk, B.; Wojenski, A.; Zabolotny, W.; Zienkiewicz, P.
2015-12-01
The measurement system based on the GEM (Gas Electron Multiplier) detector is developed for X-ray diagnostics of magnetic confinement fusion plasmas. The Triple Gas Electron Multiplier (T-GEM) is presented as a soft X-ray (SXR) energy and position sensitive detector. The paper focuses on the measurement subject and describes the fundamental data processing needed to obtain reliable characteristics (histograms) useful for physicists; it constitutes the software layer of the project, between the electronic hardware and the physics applications. The project is original and was developed by the authors. The multi-channel measurement system and the essential data processing for X-ray energy and position recognition are considered. Several modes of data acquisition, determined by hardware and software processing, are introduced. Typical measurement issues are discussed with a view to enhancing data quality. The primary version, based on a 1-D GEM detector, was applied to the high-resolution X-ray crystal spectrometer KX1 in the JET tokamak. The current version considers 2-D detector structures, initially for investigation purposes. Two detector structures, with single-pixel sensors and multi-pixel (directional) sensors, are considered for two-dimensional X-ray imaging. Fundamental output characteristics are presented for one- and two-dimensional detector structures. Representative results for a reference source and tokamak plasma are demonstrated.
Quality Measures for the Care of Patients with Insomnia
Edinger, Jack D.; Buysse, Daniel J.; Deriy, Ludmila; Germain, Anne; Lewin, Daniel S.; Ong, Jason C.; Morgenthaler, Timothy I.
2015-01-01
The American Academy of Sleep Medicine (AASM) commissioned five Workgroups to develop quality measures to optimize management and care for patients with common sleep disorders including insomnia. Following the AASM process for quality measure development, this document describes measurement methods for two desirable outcomes of therapy, improving sleep quality or satisfaction, and improving daytime function, and for four processes important to achieving these goals. To achieve the outcome of improving sleep quality or satisfaction, pre- and post-treatment assessment of sleep quality or satisfaction and providing an evidence-based treatment are recommended. To realize the outcome of improving daytime functioning, pre- and post-treatment assessment of daytime functioning, provision of an evidence-based treatment, and assessment of treatment-related side effects are recommended. All insomnia measures described in this report were developed by the Insomnia Quality Measures Workgroup and approved by the AASM Quality Measures Task Force and the AASM Board of Directors. The AASM recommends the use of these measures as part of quality improvement programs that will enhance the ability to improve care for patients with insomnia. Citation: Edinger JD, Buysse DJ, Deriy L, Germain A, Lewin DS, Ong JC, Morgenthaler TI. Quality measures for the care of patients with insomnia. J Clin Sleep Med 2015;11(3):311–334. PMID:25700881
NASA Technical Reports Server (NTRS)
Powell, W. B.
1973-01-01
Thrust chamber performance is evaluated in terms of an analytical model incorporating all the loss processes that occur in a real rocket motor. The important loss processes in the real thrust chamber were identified, and a methodology and recommended procedure for predicting real thrust chamber vacuum specific impulse were developed. Simplified equations for the calculation of vacuum specific impulse are developed to relate the delivered performance (both vacuum specific impulse and characteristic velocity) to the ideal performance as degraded by the losses corresponding to a specified list of loss processes. These simplified equations enable the various performance loss components, and the corresponding efficiencies, to be quantified separately (except that interaction effects are arbitrarily assigned in the process). The loss and efficiency expressions presented can be used to evaluate experimentally measured thrust chamber performance, to direct development effort into the areas most likely to yield improvements in performance, and as a basis to predict performance of related thrust chamber configurations.
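The report's simplified equations are not reproduced in the abstract, but the structure it describes (ideal performance degraded by separately quantified losses) is commonly written as a product of efficiencies applied to the ideal vacuum specific impulse. A hedged sketch with hypothetical loss factors (the factor names and values are illustrative, not the report's data, and interaction effects are ignored):

```python
# Delivered vacuum Isp as ideal Isp degraded by one efficiency per loss
# process. Every numeric value below is an invented example.
def delivered_isp(isp_ideal_s, efficiencies):
    """Multiply the ideal specific impulse by each loss efficiency in turn."""
    isp = isp_ideal_s
    for eta in efficiencies.values():
        isp *= eta
    return isp

losses = {
    "energy_release":  0.990,  # incomplete combustion
    "divergence":      0.995,  # two-dimensional nozzle flow
    "boundary_layer":  0.990,  # wall friction and heat loss
    "kinetics":        0.985,  # finite-rate recombination in the nozzle
}

isp = delivered_isp(450.0, losses)  # an ideal 450 s is degraded to ~432 s
```

Because each efficiency is a separate factor, a measured overall deficit can be attributed loss by loss, which is what lets development effort target the largest contributor.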
Development of an evaluation framework for African-European hospital patient safety partnerships.
Rutter, Paul; Syed, Shamsuzzoha B; Storr, Julie; Hightower, Joyce D; Bagheri-Nejad, Sepideh; Kelley, Edward; Pittet, Didier
2014-04-01
Patient safety is recognised as a significant healthcare problem worldwide, and healthcare-associated infections are an important aspect. African Partnerships for Patient Safety is a WHO programme that pairs hospitals in Africa with hospitals in Europe with the objective of working together to improve patient safety. To describe the development of an evaluation framework for hospital-to-hospital partnerships participating in the programme. The framework was structured around the programme's three core objectives: facilitate strong interhospital partnerships, improve in-hospital patient safety and spread best practices nationally. Africa-based clinicians, their European partners and experts in patient safety were closely involved in developing the evaluation framework in an iterative process. The process defined six domains of partnership strength, each with measurable subdomains. We developed a questionnaire to measure these subdomains. Participants selected six indicators of hospital patient safety improvement from a short-list of 22 based on their relevance, sensitivity to intervention and measurement feasibility. Participants proposed 20 measures of spread, which were refined into a two-part conceptual framework, and a data capture tool was created. Taking a highly participatory approach that closely involved its end users, we developed an evaluation framework and tools to measure partnership strength, patient safety improvements and the spread of best practice.
NASA Astrophysics Data System (ADS)
Wang, Pengyao; Chen, Xiangguang; Yang, Kai; Liu, Xuejiao
2017-01-01
To improve the efficiency of measuring the width and thickness of tire tread during automobile tire production, the actual conditions of the tire production process were analysed, and a fast online system for measuring the specifications of moving tire tread is established in this paper. The coordinate data of the tire tread profile are acquired by a 3D laser sensor, and the client program was developed in C#, an object-oriented programming language. The system provides a real-time display of the tire tread profile and the data required in the tire production process. Experimental results demonstrate that the measuring precision of the system is ≤ 1 mm, which meets the measurement requirements of the production process, and that the system is easy to install and test and runs stably.
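The paper's algorithm is not given in the abstract, but extracting tread width and thickness from one laser-scanned cross-section can be sketched as simple extrema over the profile's (x, z) coordinate pairs, with z measured from the conveyor baseline. A minimal illustration (the baseline assumption and all figures are hypothetical):

```python
def tread_dimensions(profile, baseline_z=0.0):
    """profile: list of (x, z) points from one 3D laser-sensor scan line, in mm.
    Returns (width, thickness): lateral extent and peak height above baseline."""
    xs = [x for x, _ in profile]
    zs = [z for _, z in profile]
    width = max(xs) - min(xs)
    thickness = max(zs) - baseline_z
    return width, thickness

# One hypothetical scan line across a moving tread:
scan = [(-95.0, 0.0), (-90.0, 12.1), (0.0, 12.4), (88.0, 12.0), (95.0, 0.0)]
w, t = tread_dimensions(scan)  # width 190.0 mm, thickness 12.4 mm
```

A production system would additionally filter sensor noise and detect the tread edges rather than take raw extrema, which is where the reported ≤ 1 mm precision is won.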
Evans, Rachel C; Kyeremateng, Samuel O; Asmus, Lutz; Degenhardt, Matthias; Rosenberg, Joerg; Wagner, Karl G
2018-05-01
The aim of this work was to investigate the use of torasemide as a highly sensitive indicator substance and to develop a formulation thereof for establishing quantitative relationships between hot-melt extrusion process conditions and critical quality attributes (CQAs). Using solid-state characterization techniques and a 10 mm lab-scale co-rotating twin-screw extruder, we studied torasemide in a Soluplus® (SOL)-polyethylene glycol 1500 (PEG 1500) matrix, and developed and characterized a formulation which was used as a process indicator to study thermal- and hydrolysis-induced degradation, as well as residual crystallinity. We found that torasemide first dissolved into the matrix and then degraded. Based on this mechanism, extrudates with measurable levels of degradation and residual crystallinity were produced, depending strongly on the main barrel and die temperature and residence time applied. In addition, we found that 10% w/w PEG 1500 as plasticizer resulted in the widest operating space with the widest range of measurable residual crystallinity and degradant levels. Torasemide as an indicator substance behaves like a challenging-to-process API, only with higher sensitivity and more pronounced effects, e.g., degradation and residual crystallinity. Application of a model formulation containing torasemide will enhance the understanding of the dynamic environment inside an extruder and elucidate the cumulative thermal and hydrolysis effects of the extrusion process. The use of such a formulation will also facilitate rational process development and scaling by establishing clear links between process conditions and CQAs.
A Conceptual and Measurement Framework to Guide Policy Development and Systems Change
ERIC Educational Resources Information Center
Schalock, Robert L.; Verdugo, Miguel Angel
2012-01-01
The authors describe a conceptual and measurement framework that provides a template for guiding policy development and systems change. The framework is built on the concepts of vertical and horizontal alignment, system-level processes, and organization-level practices. Application of the framework can structure the thinking and analytic…
Instrument Development and Validation of the Infant and Toddler Assessment for Quality Improvement
ERIC Educational Resources Information Center
Perlman, Michal; Brunsek, Ashley; Hepditch, Anne; Gray, Karen; Falenchuck, Olesya
2017-01-01
Research Findings: There is a growing need for accurate and efficient measures of classroom quality in early childhood education and care (ECEC) settings. Observational measures are costly, as their administration generally takes 3-5 hr per classroom. This article outlines the process of development and preliminary concurrent validity testing of…
Tung, Li-Chen; Yu, Wan-Hui; Lin, Gong-Hong; Yu, Tzu-Ying; Wu, Chien-Te; Tsai, Chia-Yin; Chou, Willy; Chen, Mei-Hsiang; Hsieh, Ching-Lin
2016-09-01
To develop a Tablet-based Symbol Digit Modalities Test (T-SDMT) and to examine the test-retest reliability and concurrent validity of the T-SDMT in patients with stroke. The study had two phases. In the first phase, six experts, nine college students and five outpatients participated in the development and testing of the T-SDMT. In the second phase, 52 outpatients were evaluated twice (2 weeks apart) with the T-SDMT and SDMT to examine the test-retest reliability and concurrent validity of the T-SDMT. The T-SDMT was developed via expert input and college student/patient feedback. Regarding test-retest reliability, the practise effects of the T-SDMT and SDMT were both trivial (d=0.12) but significant (p ≤ 0.015). The improvement in the T-SDMT (4.7%) was smaller than that in the SDMT (5.6%). The minimal detectable changes (MDC%) of the T-SDMT and SDMT were 6.7 (22.8%) and 10.3 (32.8%), respectively. The T-SDMT and SDMT were highly correlated with each other at the two time points (Pearson's r=0.90-0.91). The T-SDMT demonstrated good concurrent validity with the SDMT. Because the T-SDMT had a smaller practise effect and less random measurement error (superior test-retest reliability), it is recommended over the SDMT for assessing information processing speed in patients with stroke. Implications for Rehabilitation: The Symbol Digit Modalities Test (SDMT), a common measure of information processing speed, showed a substantial practise effect and considerable random measurement error in patients with stroke. The Tablet-based SDMT (T-SDMT) was developed to reduce the practise effect and random measurement error of the SDMT in patients with stroke. The T-SDMT had a smaller practise effect and less random measurement error than the SDMT, and can thus provide more reliable assessments of information processing speed.
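Minimal detectable change values like those reported above are conventionally derived from the distribution-based formula MDC95 = 1.96 · √2 · SEM, with SEM = SD · √(1 − r) for test-retest reliability coefficient r. A sketch of the computation (the SD and r inputs below are illustrative values chosen to land near the reported magnitude, not the study's raw data):

```python
import math

def mdc95(sd, reliability):
    """95% minimal detectable change from a test-retest reliability coefficient."""
    sem = sd * math.sqrt(1.0 - reliability)   # standard error of measurement
    return 1.96 * math.sqrt(2.0) * sem        # change exceeding retest noise at 95%

def mdc_percent(sd, reliability, mean_score):
    """MDC expressed relative to the sample mean, as reported in such studies."""
    return 100.0 * mdc95(sd, reliability) / mean_score

# Hypothetical inputs: SD = 12 points, r = 0.96 -> MDC ~ 6.65 points,
# comparable in magnitude to the T-SDMT value reported above.
```

A retest difference smaller than the MDC cannot be distinguished from measurement noise, which is why a smaller MDC implies a more sensitive instrument.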
Thermal Remote Sensing and the Thermodynamics of Ecosystems Development
NASA Technical Reports Server (NTRS)
Luvall, Jeffrey C.; Kay, James J.; Fraser, Roydon F.; Goodman, H. Michael (Technical Monitor)
2001-01-01
Thermal remote sensing can provide environmental measuring tools with capabilities for measuring ecosystem development and integrity. Recent advances in applying principles of nonequilibrium thermodynamics to ecology provide fundamental insights into energy partitioning in ecosystems. Ecosystems are nonequilibrium systems, open to material and energy flows, which grow and develop structures and processes to increase energy degradation. More developed terrestrial ecosystems are more effective at dissipating the solar gradient (degrading its energy content). This can be measured by the effective surface temperature of the ecosystem on a landscape scale. A series of airborne thermal infrared multispectral scanner data were collected from several forested ecosystems, ranging from a western US Douglas-fir forest to a tropical rain forest in Costa Rica. Agricultural systems were also measured. These data were used to develop measures of ecosystem development and integrity based on surface temperature.
ERIC Educational Resources Information Center
Coch, Donna; Benoit, Clarisse
2015-01-01
We investigated whether and how standardized behavioral measures of reading and electrophysiological measures of reading were related in 72 typically developing, late elementary school children. Behavioral measures included standardized tests of spelling, phonological processing, vocabulary, comprehension, naming speed, and memory.…
Torque measurement at the single-molecule level.
Forth, Scott; Sheinin, Maxim Y; Inman, James; Wang, Michelle D
2013-01-01
Methods for exerting and measuring forces on single molecules have revolutionized the study of the physics of biology. However, it is often the case that biological processes involve rotation or torque generation, and these parameters have been more difficult to access experimentally. Recent advances in the single-molecule field have led to the development of techniques that add the capability of torque measurement. By combining force, displacement, torque, and rotational data, a more comprehensive description of the mechanics of a biomolecule can be achieved. In this review, we highlight a number of biological processes for which torque plays a key mechanical role. We describe the various techniques that have been developed to directly probe the torque experienced by a single molecule, and detail a variety of measurements made to date using these new technologies. We conclude by discussing a number of open questions and propose systems of study that would be well suited for analysis with torsional measurement techniques.
Distributed collaborative team effectiveness: measurement and process improvement
NASA Technical Reports Server (NTRS)
Wheeler, R.; Hihn, J.; Wilkinson, B.
2002-01-01
This paper describes a measurement methodology developed for assessing the readiness, and identifying opportunities for improving the effectiveness, of distributed collaborative design teams preparing to conduct a concurrent design session.
Development of an Instrument for the Measurement of Leadership Commitment to Organizational Process
ERIC Educational Resources Information Center
Hylton, Peter D.
2013-01-01
The purpose of this research study was to create a new instrument designed to examine the commitment of an organization's leadership to following organizational processes, as measured by stakeholder perceptions. This instrument was designed to aid in closure of a gap in the field of leadership studies relative to the impact that a leader's…
Flow Tube Studies of Gas Phase Chemical Processes of Atmospheric Importance
NASA Technical Reports Server (NTRS)
Molina, Mario J.
1997-01-01
The objective of this project is to conduct measurements of elementary reaction rate constants and photochemistry parameters for processes of importance in the atmosphere. These measurements are being carried out under temperature and pressure conditions covering those applicable to the stratosphere and upper troposphere, using the chemical ionization mass spectrometry turbulent flow technique developed in our laboratory.
ERIC Educational Resources Information Center
Stahl, Robert J.
1986-01-01
Reports the steps taken to develop a satisfactory group measure of the Casteel-Stahl model of cognitive-affect-process education. The resulting 60-item Likert format instrument measures a wide array of instructional outcomes, from empathy, communications, decision making, problem solving and personal consistency to acceptance of self and…
2003-08-09
VANDENBERG AIR FORCE BASE, CALIF. - Workers mate the Pegasus , with its cargo of the SciSat-1 payload to the L-1011 carrier aircraft. The SciSat-1 weighs approximately 330 pounds and after launch will be placed in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
2003-07-29
VANDENBERG AIR FORCE BASE, CALIF. - At Vandenberg AFB, Calif., a solar array is tested before installing on the SciSat-1 spacecraft. The SciSat-1 weighs approximately 330 pounds and after launch will be placed in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
2003-07-29
VANDENBERG AIR FORCE BASE, CALIF. - At Vandenberg AFB, Calif., a solar array is installed on the SciSat-1 spacecraft. The SciSat-1 weighs approximately 330 pounds and after launch will be placed in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
2003-08-09
VANDENBERG AIR FORCE BASE, CALIF. - The SciSat-1 payload and Pegasus launch vehicle are lifted and mated to the L-1011 carrier aircraft. The SciSat-1 weighs approximately 330 pounds and after launch will be placed in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
NASA Astrophysics Data System (ADS)
Glaser, Ulf; Li, Zhichao; Bichmann, Stephan, II; Pfeifer, Tilo
2003-05-01
With China's entry into the WTO, Chinese as well as German companies face the question of how to minimize the risk of unfamiliar cooperation partners when developing products. Rising customer demands concerning quality, product diversity and the reduction of expenses require flexibility and efficiency with reliable component suppliers. In order to build and strengthen Sino-German cooperations, manufacturing control using homogenized and efficient measures to assure high quality is of vital importance. Lacking unification, identical measurements conducted at subcontractors or customers may be carried out with different measurement processes, which leads to incomparable results. Rapidly growing company cooperations and the simultaneous decrease in manufacturing scope cause substantial difficulties when coordinating joint quality control activities. "ProSens," a Sino-German project consortium consisting of industrial users, technology producers and research institutes, aims at improving selected production processes by: creation of a homogeneous quality awareness in Sino-German cooperations; sensitization for process-accompanying metrology at an early stage of product development; increase of process performance by the use of integrated metrology; reduction of production time and cost; and unification of quality control of complex products by means of efficient measurement strategies and CAD-based inspection planning.
Plasma processing of superconducting radio frequency cavities
NASA Astrophysics Data System (ADS)
Upadhyay, Janardan
The development of plasma processing technology for superconducting radio frequency (SRF) cavities not only provides a chemical-free and less expensive processing method, but also opens up the possibility of controlled modification of the inner surfaces of the cavity for better superconducting properties. The research focused on the transition of plasma etching from two-dimensional flat surfaces to the inner surfaces of three-dimensional (3D) structures. The results could be applicable to a variety of inner surfaces of 3D structures other than SRF cavities. Understanding the Ar/Cl2 plasma etching mechanism is crucial for achieving the desired modification of Nb SRF cavities. In the process of developing plasma etching technology, an apparatus was built and a method was developed to plasma etch a single-cell pillbox cavity. Plasma characterization was done with the help of optical emission spectroscopy. The Nb etch rate at various points of this cavity was measured before processing the SRF cavity. Cylindrical ring-type samples of Nb placed on the inner surface of the outer wall were used to measure the dependence of plasma etching on the process parameters. The dependence of the etch rate on pressure, rf power, dc bias, temperature, Cl2 concentration and the diameter of the inner electrode was measured. The etch rate mechanism was studied by varying the temperature of the outer wall, the dc bias on the inner electrode and the gas conditions. In a coaxial plasma reactor, uniform plasma etching along the cylindrical structure is a challenging task due to depletion of the active radicals along the gas flow direction. The dependence of etch rate uniformity along the cylindrical axis was determined as a function of process parameters. The formation of dc self-biases due to surface area asymmetry in this type of plasma, and their variation with pressure, rf power and gas composition, was measured.
Enhancing the surface area of the inner electrode to reduce the asymmetry was studied by changing the contour of the inner electrode. The optimized contour of the electrode based on these measurements was chosen for SRF cavity processing.
Perspective on the National Aero-Space Plane Program instrumentation development
NASA Technical Reports Server (NTRS)
Bogue, Rodney K.; Erbland, Peter
1993-01-01
A review of the requirement for, and development of, advanced measurement technology for the National Aero-Space Plane (NASP) program is presented. The objective is to discuss the technical need and the program commitment required to ensure that adequate and timely measurement capabilities are provided for ground and flight testing in the NASP program. The scope of the measurement problem is presented and the measurement process is described. The paper examines how instrumentation technology development has been affected by NASP program evolution, discusses the national effort to define measurement requirements and assess the adequacy of current technology to support the NASP program, and summarizes the measurement requirements. The unique features of the NASP program that complicate the understanding of requirements and the development of viable solutions are illustrated.
Hasse, J U; Weingaertner, D E
2016-01-01
As the central product of the BMBF-KLIMZUG-funded Joint Network and Research Project (JNRP) 'dynaklim - Dynamic adaptation of regional planning and development processes to the effects of climate change in the Emscher-Lippe region (North Rhine-Westphalia, Germany)', the Roadmap 2020 'Regional Climate Adaptation' has been developed by the various regional stakeholders and institutions, and contains specific regional scenarios, strategies and adaptation measures applicable throughout the region. This paper presents the method, elements and main results of this regional roadmap process using the example of the thematic sub-roadmap 'Water Sensitive Urban Design 2020'. With a focus on the process support tool 'KlimaFLEX', one of the main adaptation measures of the WSUD 2020 roadmap, typical challenges for integrated climate change adaptation such as scattered knowledge, knowledge gaps and divided responsibilities, but also potential solutions and promising opportunities for urban development and urban water management, are discussed. With the roadmap and the related tool, the relevant stakeholders of the Emscher-Lippe region have jointly developed important prerequisites to integrate their knowledge, to clarify vulnerabilities, adaptation goals, responsibilities and interests, and to coordinate with foresight the measures, resources, priorities and schedules needed for efficient joint urban planning, well-grounded decision-making in times of continued uncertainty, and step-by-step implementation of adaptation measures from now on.
NASA Astrophysics Data System (ADS)
Jackisch, Conrad; Allroggen, Niklas
2017-04-01
The lack of vision into the subsurface appears to be a major limiting factor for our hydrological process understanding and theory development. Today, hydrology-related sciences have collected tremendous evidence for soils acting simultaneously as drainage networks and retention stores in structured and self-organising domains. However, our present observation technology relies mainly on point-scale sensors, which integrate over a volume of unknown structures and are blind to their distribution. Although heterogeneity is acknowledged at all scales, it is rarely seen as an inherent system property. At small scales (soil moisture probe) and at large scales (neutron probe) our measurements leave considerable ambiguity. Consequently, spatially and temporally continuous measurement of soil water states is essential for advancing our understanding and development of subsurface process theories. We present results from several irrigation experiments accompanied by 2D and 3D time-lapse GPR for the development of a novel technique to visualise and quantify water dynamics in the subsurface. Through the comparison of TDR, tracer and gravimetric measurements of soil moisture it becomes apparent that all sensor-based techniques are capable of recording temporal dynamics, but struggle to quantify the measurements precisely and to extrapolate them in space. At the same time, excavative methods are very limited in temporal and spatial resolution. The application of non-invasive 4D GPR measurements complements the existing techniques and reveals structural and temporal dynamics simultaneously. By consistently increasing the density of the GPR data recordings in time and space, we find means to process the data in the time dimension as well. This opens ways to quantitatively analyse soil water dynamics in complex settings.
Developing Elementary Math and Science Process Skills Through Engineering Design Instruction
NASA Astrophysics Data System (ADS)
Strong, Matthew G.
This paper examines how elementary students can develop math and science process skills through an engineering design approach to instruction. The performance and development of individual process skills overall and by gender were also examined. The study, preceded by a pilot, took place in a grade four extracurricular engineering design program in a public, suburban school district. Students worked in pairs and small groups to design and construct airplane models from styrofoam, paper clips, and toothpicks. The development and performance of process skills were assessed through a student survey of learning gains, an engineering design packet rubric (student work), observation field notes, and focus group notes. The results indicate that students can significantly develop process skills, that female students may develop process skills through engineering design better than male students, and that engineering design is most helpful for developing the measuring, suggesting improvements, and observing process skills. The study suggests that a more regular engineering design program or curriculum could be beneficial for students' math and science abilities both in this school and for the elementary field as a whole.
Wirtz, Sebastian F; Cunha, Adauto P A; Labusch, Marc; Marzun, Galina; Barcikowski, Stephan; Söffker, Dirk
2018-06-01
Today, the demand for continuous monitoring of valuable or safety-critical equipment is increasing in many industrial applications due to safety and economic requirements. Therefore, reliable in-situ measurement techniques are required, for instance in Structural Health Monitoring (SHM) as well as process monitoring and control. Here, current challenges are related to the processing of sensor data with a high data rate and low latency. In particular, measurement and analysis of Acoustic Emission (AE) are widely used for passive, in-situ inspection. The advantages of AE are related to its sensitivity to different micro-mechanical mechanisms at the material level. However, online processing of AE waveforms is computationally demanding. The related equipment is typically bulky, expensive, and not well suited for permanent installation. The contribution of this paper is the development of a Field Programmable Gate Array (FPGA)-based measurement system using a ZedBoard development kit with a Zynq-7000 system on chip for embedded implementation of suitable online processing algorithms. This platform comprises a dual-core Advanced Reduced Instruction Set Computer Machine (ARM) architecture running a Linux operating system and FPGA fabric. An FPGA-based hardware implementation of the discrete wavelet transform is realized to accelerate processing of the AE measurements. Key features of the system are low cost, small form factor, and low energy consumption, which make it suitable to serve as a field-deployed measurement and control device. For verification of the functionality, a novel, automatically realized adjustment of the working distance during pulsed laser ablation in liquids is established as an example. A sample rate of 5 MHz is achieved at 16-bit resolution.
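As a point of reference for the wavelet processing described above, the following is a minimal single-level Haar DWT in plain Python. It is an illustrative sketch of the kind of decomposition the FPGA fabric accelerates, not the authors' hardware implementation.

```python
# Minimal single-level Haar discrete wavelet transform, as a software
# reference for the decomposition an FPGA implementation accelerates.
# Illustrative sketch only.
import math

def haar_dwt(signal):
    """Return (approximation, detail) coefficients for one DWT level."""
    if len(signal) % 2:
        signal = signal + [signal[-1]]  # pad to even length
    s = 1.0 / math.sqrt(2.0)
    approx = [(a + b) * s for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

# Example: a short burst-like AE waveform
approx, detail = haar_dwt([0.0, 1.0, 4.0, 2.0, 1.0, 0.0, 0.0, 0.0])
```

In a streaming hardware realization the same pairwise sums and differences are computed with fixed-point arithmetic, which is what makes the transform attractive for low-latency AE processing.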
USDA-ARS?s Scientific Manuscript database
Since maturational processes triggering increased attunement to native language features in early infancy are sensitive to dietary factors, infant-diet related differences in brain processing of native-language speech stimuli might indicate variations in onset of this tuning process. We measured cor...
Maturation of Visual and Auditory Temporal Processing in School-Aged Children
ERIC Educational Resources Information Center
Dawes, Piers; Bishop, Dorothy V. M.
2008-01-01
Purpose: To examine development of sensitivity to auditory and visual temporal processes in children and the association with standardized measures of auditory processing and communication. Methods: Normative data on tests of visual and auditory processing were collected on 18 adults and 98 children aged 6-10 years of age. Auditory processes…
Patterson, Olga V; Freiberg, Matthew S; Skanderson, Melissa; J Fodeh, Samah; Brandt, Cynthia A; DuVall, Scott L
2017-06-12
In order to investigate the mechanisms of cardiovascular disease in HIV infected and uninfected patients, an analysis of echocardiogram reports is required for a large longitudinal multi-center study. A natural language processing system using a dictionary lookup, rules, and patterns was developed to extract heart function measurements that are typically recorded in echocardiogram reports as measurement-value pairs. Curated semantic bootstrapping was used to create a custom dictionary that extends existing terminologies based on terms that actually appear in the medical record. A novel disambiguation method based on semantic constraints was created to identify and discard erroneous alternative definitions of the measurement terms. The system was built utilizing a scalable framework, making it available for processing large datasets. The system was developed for and validated on notes from three sources: general clinic notes, echocardiogram reports, and radiology reports. The system achieved F-scores of 0.872, 0.844, and 0.877 with precision of 0.936, 0.982, and 0.969 for each dataset respectively averaged across all extracted values. Left ventricular ejection fraction (LVEF) is the most frequently extracted measurement. The precision of extraction of the LVEF measure ranged from 0.968 to 1.0 across different document types. This system illustrates the feasibility and effectiveness of a large-scale information extraction on clinical data. New clinical questions can be addressed in the domain of heart failure using retrospective clinical data analysis because key heart function measurements can be successfully extracted using natural language processing.
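The abstract above describes dictionary-, rule-, and pattern-based extraction of measurement-value pairs from clinical text. The toy sketch below shows the flavor of such a pattern for LVEF values; the synonym list and regular expression are hypothetical illustrations, not the published system's rules.

```python
# Toy sketch of rule/pattern-based extraction of measurement-value pairs
# (e.g. LVEF) from free-text echocardiogram reports. The patterns and
# synonym list are illustrative assumptions, not the published system.
import re

LVEF_SYNONYMS = r"(?:LVEF|EF|ejection fraction|left ventricular ejection fraction)"
PATTERN = re.compile(
    LVEF_SYNONYMS + r"\s*(?:is|of|was|[:=])?\s*(\d{1,2}(?:\.\d+)?)\s*%",
    re.IGNORECASE,
)

def extract_lvef(report_text):
    """Return all LVEF percentage values found in the text."""
    return [float(m) for m in PATTERN.findall(report_text)]

print(extract_lvef("Normal LV size. LVEF: 55 %. Prior ejection fraction was 60%."))
# -> [55.0, 60.0]
```

A production system would add disambiguation (as the paper does) to discard matches where an abbreviation such as "EF" refers to something else.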
Advanced thermal barrier coatings for operation in high hydrogen content fueled gas turbines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sampath, Sanjay
2015-04-02
The Center for Thermal Spray Research (CTSR) at Stony Brook University, in partnership with its industrial Consortium for Thermal Spray Technology, is investigating science and technology related to advanced metallic alloy bond coats and ceramic thermal barrier coatings for applications in the hot section of gasified coal-based high hydrogen turbine power systems. In conjunction with our OEM partners (GE and Siemens) and through strategic partnership with Oak Ridge National Laboratory (ORNL) (materials degradation group and high temperature materials laboratory), a systems approach, considering all components of the TBC (multilayer ceramic top coat, metallic bond coat & superalloy substrate), is being taken during multi-layered coating design, process development and subsequent environmental testing. Recent advances in process science and advanced in situ thermal spray coating property measurement enabled within CTSR have been incorporated for full-field enhancement of coating and process reliability. The development of bond coat processing during this program explored various aspects of processing and microstructure and linked them to performance. The determination of the bond coat material was carried out during the initial stages of the program. Based on tests conducted both at Stony Brook University and at ORNL, it was determined that the NiCoCrAlYHfSi (Amdry) bond coats had considerable benefits over NiCoCrAlY bond coats. Since the studies were also conducted at different cycling frequencies, thereby addressing an associated need for performance under different loading conditions, the Amdry bond coat was selected as the material of choice going forward in the program. With initial investigations focused on the fabrication of HVOF bond coats and the performance of TBC under furnace cycle tests, several processing strategies were developed.
Two-layered HVOF bond coats were developed to render an optimal balance of density and surface roughness and resulted in improved TBC lifetimes. Processing-based approaches to identifying optimal processing regimes, deploying advanced in-situ coating property measurements and in-flight diagnostic tools, were used to develop process maps for bond coats. Having established a framework for bond coat processing using the HVOF process, efforts were channeled toward fabrication of APS and VPS bond coats with the same material composition. A comparative evaluation of the three deposition processes with regard to their microstructure, surface profiles and TBC performance was carried out and provided valuable insights into factors that require concurrent consideration for the development of bond coats for advanced TBC systems. Over the course of this program several advancements were made in the development of durable thermal barrier coatings. Process optimization techniques were utilized to identify processing regimes for conventional YSZ as well as other TBC compositions such as Gadolinium Zirconate and other co-doped materials. Measurement of critical properties for these formed the initial stages of the program, to identify potential challenges in their implementation as part of a TBC system. High temperature thermal conductivity measurements as well as the sintering behavior of both YSZ and GDZ coatings were evaluated as part of initial efforts to understand the influence of processing on coating properties. By effectively linking fundamental coating properties of fracture toughness and elastic modulus to the cyclic performance of coatings, a durability strategy for APS YSZ coatings was developed. In order to meet the goal of fabricating a multimaterial TBC system, further research was carried out on the development of a gradient thermal conductivity model and the evaluation of the sintering behavior of multimaterial coatings.
Layer optimization for desired properties in the multimaterial TBC was achieved by an iterative feedback approach utilizing process maps and in-situ and ex-situ coating property sensors. Addressing the challenges pertaining to the integration of the two materials, YSZ and GDZ, led to one of the most critical outcomes of this program: the development of durable multimaterial, multifunctional TBC systems.
Task analysis exemplified: the process of resolving unfinished business.
Greenberg, L S; Foerster, F S
1996-06-01
The steps of a task-analytic research program designed to identify the in-session performances involved in resolving lingering bad feelings toward a significant other are described. A rational-empirical methodology of repeatedly cycling between rational conjecture and empirical observation is demonstrated as a method of developing an intervention manual and the components of client processes of resolution. A refined model of the change process developed by these procedures is validated by comparing 11 successful and 11 unsuccessful performances. Four performance components (intense expression of feeling, expression of need, shift in representation of the other, and self-validation or understanding of the other) were found to discriminate between resolution and nonresolution performances. These components were measured on four process measures: the Structural Analysis of Social Behavior, the Experiencing Scale, the Client's Emotional Arousal Scale, and a need scale.
Evaluation of an attributive measurement system in the automotive industry
NASA Astrophysics Data System (ADS)
Simion, C.
2016-08-01
Measurement System Analysis (MSA) is a critical component of any quality improvement process. MSA is defined as an experimental and mathematical method of determining how much the variation within the measurement process contributes to overall process variability, and it falls into two categories: attribute and variable. The most problematic measurement system issues arise when measuring attribute data, which are usually the result of human judgment (visual inspection). Because attributive measurement systems are often used in manufacturing processes, their assessment is important to gain confidence in the inspection process, to locate problems in order to eliminate them, and to guide process improvement. It was the aim of this paper to address such an issue, presenting a case study carried out at a local company in the Sibiu region that supplies products for the automotive industry, specifically the bag (a technical textile component, i.e. the fabric) for the airbag module. Because defects are inherent in every manufacturing process, and because in airbag systems even a minor defect can influence performance on which lives depend, stringent visual inspection of defects in the bag material is required. The purpose of this attribute MSA was: to determine whether all inspectors use the same criteria to distinguish “pass” from “fail” product (i.e. the fabric); to assess company inspection standards against the customer's requirements; to determine how consistent inspectors are with themselves; to identify how inspectors conform to a “known master,” including how often operators ship defective product and how often operators dispose of acceptable product; and to discover areas where training is required, procedures must be developed, and standards are not available. The results were analyzed using MINITAB software with its Attribute Agreement Analysis module.
The conclusion was that the inspection process must be improved by operator training, developing visual aids/boundary samples, establishing standards and set-up procedures.
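An attribute agreement analysis of the kind described above rests on agreement statistics such as Cohen's kappa between an inspector's calls and the known standard. A minimal sketch, with hypothetical pass/fail data:

```python
# Minimal Cohen's kappa between an inspector's pass/fail calls and a
# known standard -- the core statistic behind an attribute agreement
# analysis such as Minitab's. Illustrative sketch with hypothetical data.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

inspector = ["pass", "pass", "fail", "pass", "fail", "pass", "fail", "fail"]
standard  = ["pass", "pass", "fail", "fail", "fail", "pass", "fail", "pass"]
print(round(cohens_kappa(inspector, standard), 3))   # -> 0.5
```

Kappa corrects raw percent agreement for the agreement expected by chance; values near 1 indicate a trustworthy inspection process, while low values flag the need for training or boundary samples.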
Measurement of thermal deformation of an engine piston using a conical mirror and ESPI
NASA Astrophysics Data System (ADS)
Albertazzi, Armando, Jr.; Melao, Iza; Devece, Eugenio
1998-07-01
An experimental technique is developed to measure the radial displacement component of cylindrical surfaces using a conical mirror for normal illumination and observation. Single-illumination ESPI is used to obtain fringe patterns related to the radial displacement field. Some data processing strategies are presented and discussed to properly extract the measurement data. Data reduction algorithms are developed to quantify and compensate for the rigid body displacements: translations and rotations. The displacement component responsible for shape distortion (deformation) can be separated from the total displacement field. The thermal radial deformation of an aluminum engine piston with a steel sash is measured by this technique. A temperature change of about 2 degrees Celsius was applied to the engine piston by means of an electrical wire wrapped in the first engine piston groove. The fringe patterns are processed and the results are presented as polar plots and a 3D representation. The main advantages and limitations of the developed technique are discussed.
NASA Astrophysics Data System (ADS)
Amado, Antonio; Schmid, Manfred; Wegener, Konrad
2015-05-01
Polymer processing using Additive Manufacturing (AM) technologies has experienced remarkable growth in recent years. The application range has been expanding rapidly, particularly driven by the so-called consumer 3D printing sector. However, for applications demanding higher requirements in terms of thermo-mechanical properties and dimensional accuracy, long-established AM technologies such as Selective Laser Sintering (SLS) have not seen comparable development. The higher process complexity limits the number of materials that can currently be processed, and the interactions between the different physics involved have not been fully investigated. In the case of thermoplastic materials, the crystallization kinetics coupled to the development of shrinkage strain strongly influences the stability of the process. Thus, the current investigation presents a transient Finite Element simulation of the warpage effect during the SLS processing of a newly developed polyolefin (co-polypropylene), coupling the thermal, mechanical and phase change equations that control the process. A thermal characterization of the material was performed by means of DSC, integrating the Nakamura model with the classical Hoffman-Lauritzen theory. The viscoelastic behavior was measured using a plate-plate rheometer at different degrees of undercooling, and a phase change-temperature superposition principle was implemented. Additionally, for validation purposes, the warpage development of the first sintered layers was captured employing an optical device. The simulation results show good agreement with experimental measurements of deformation, capturing the high sensitivity of the geometrical accuracy of the sintered parts to the processing conditions.
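The Nakamura crystallization-kinetics model mentioned above can be sketched numerically. The following simple Euler integration uses illustrative values for the rate constant K(T) and Avrami exponent n, not the material parameters of the paper:

```python
# Euler integration of the Nakamura crystallization model,
# dX/dt = n*K(T)*(1-X)*(-ln(1-X))**((n-1)/n), for relative crystallinity X.
# K(T) and n below are illustrative values, not fitted material data.
import math

def nakamura_step(x, k_t, n, dt):
    x = min(max(x, 1e-12), 1.0 - 1e-12)          # keep the log argument valid
    rate = n * k_t * (1.0 - x) * (-math.log(1.0 - x)) ** ((n - 1.0) / n)
    return min(x + rate * dt, 1.0 - 1e-12)

def crystallinity(k_t=0.5, n=3.0, dt=1e-3, t_end=10.0):
    """Relative crystallinity X(t) under a constant (isothermal) K(T)."""
    x, history = 1e-6, []
    for _ in range(int(t_end / dt)):
        x = nakamura_step(x, k_t, n, dt)
        history.append(x)
    return history

x_t = crystallinity()
assert 0.0 < x_t[0] < x_t[-1] < 1.0              # monotone growth toward 1
```

In the coupled simulation this kinetics term drives the phase-change and shrinkage-strain equations, since the local degree of crystallinity determines the strain developed on cooling.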
Pisoni, David B; Kronenberger, William G; Roman, Adrienne S; Geers, Ann E
2011-02-01
Conventional assessments of outcomes in deaf children with cochlear implants (CIs) have focused primarily on endpoint or product measures of speech and language. Little attention has been devoted to understanding the basic underlying core neurocognitive factors involved in the development and processing of speech and language. In this study, we examined the development of factors related to the quality of phonological information in immediate verbal memory, including immediate memory capacity and verbal rehearsal speed, in a sample of deaf children after >10 yrs of CI use and assessed the correlations between these two process measures and a set of speech and language outcomes. Of an initial sample of 180 prelingually deaf children with CIs assessed at ages 8 to 9 yrs after 3 to 7 yrs of CI use, 112 returned for testing again in adolescence after 10 more years of CI experience. In addition to completing a battery of conventional speech and language outcome measures, subjects were administered the Wechsler Intelligence Scale for Children-III Digit Span subtest to measure immediate verbal memory capacity. Sentence durations obtained from the McGarr speech intelligibility test were used as a measure of verbal rehearsal speed. Relative to norms for normal-hearing children, Digit Span scores were well below average for children with CIs at both elementary and high school ages. Improvement was observed over the 8-yr period in the mean longest digit span forward score but not in the mean longest digit span backward score. Longest digit span forward scores at ages 8 to 9 yrs were significantly correlated with all speech and language outcomes in adolescence, but backward digit spans correlated significantly only with measures of higher-order language functioning over that time period. 
While verbal rehearsal speed increased for almost all subjects between elementary grades and high school, it was still slower than the rehearsal speed obtained from a control group of normal-hearing adolescents. Verbal rehearsal speed at ages 8 to 9 yrs was also found to be strongly correlated with speech and language outcomes and Digit Span scores in adolescence. Despite improvement after 8 additional years of CI use, measures of immediate verbal memory capacity and verbal rehearsal speed, which reflect core fundamental information processing skills associated with representational efficiency and information processing capacity, continue to be delayed in children with CIs relative to NH peers. Furthermore, immediate verbal memory capacity and verbal rehearsal speed at 8 to 9 yrs of age were both found to predict speech and language outcomes in adolescence, demonstrating the important contribution of these processing measures for speech-language development in children with CIs. Understanding the relations between these core underlying processes and speech-language outcomes in children with CIs may help researchers to develop new approaches to intervention and treatment of deaf children who perform poorly with their CIs. Moreover, this knowledge could be used for early identification of deaf children who may be at high risk for poor speech and language outcomes after cochlear implantation as well as for the development of novel targeted interventions that focus selectively on these core elementary information processing variables.
Listening Effort Through Depth of Processing in School-Age Children.
Hsu, Benson Cheng-Lin; Vanpoucke, Filiep; van Wieringen, Astrid
A reliable and practical measure of listening effort is crucial in the aural rehabilitation of children with communication disorders. In this article, we propose a novel behavioral paradigm designed to measure listening effort in school-age children based on different depths and levels of verbal processing. The paradigm consists of a classic word recognition task performed in quiet and in noise coupled to one of three additional tasks asking the children to judge the color of simple pictures or a certain semantic category of the presented words. The response time (RT) from the categorization tasks is considered the primary indicator of listening effort. The listening effort paradigm was evaluated in a group of 31 normal-hearing, normal-developing children 7 to 12 years of age. A total of 146 Dutch nouns were selected for the experiment after surveying 14 local Dutch-speaking children. Windows-based custom software was developed to administer the behavioral paradigm from a conventional laptop computer. A separate touch screen was used as a response interface to gather the RT data from the participants. Verbal repetition of each presented word was scored by the tester and a percentage-correct word recognition score (WRS) was calculated for each condition. Randomized lists of target words were presented in one of three signal to noise ratios (SNR) to examine the effect of background noise on the two outcome measures of WRS and RT. Three novel categorization tasks, each corresponding to a different depth or elaboration level of semantic processing, were developed to examine the effect of processing level on either WRS or RT. It was hypothesized that, while listening effort as measured by RT would be affected by both noise and processing level, WRS performance would be affected by changes in noise level only. 
The RT measure was also hypothesized to increase more with increasing noise level in categorization conditions demanding a deeper or more elaborate form of semantic processing. There was a significant effect of SNR level on school-age children's WRS: their word recognition performance tended to decrease with increasing background noise level. However, depth of processing did not seem to affect WRS. Moreover, a repeated-measures analysis of variance fitted to transformed RT data revealed that this measure of listening effort in normal-hearing school-age children was significantly affected by both SNR level and the depth of semantic processing. There was no significant interaction between noise level and the type of categorization task with regard to RT. The observed patterns of WRS and RT supported the hypotheses regarding the effects of background noise and depth of processing on word recognition performance and a behavioral measure of listening effort. The magnitude of noise-induced change in RT did not differ between categorization tasks, however. Our findings point to future research directions regarding the potential effects of age, working memory capacity, and cross-modality interaction when measuring listening effort at different levels of semantic processing.
Adaptive strategy for joint measurements
NASA Astrophysics Data System (ADS)
Uola, Roope; Luoma, Kimmo; Moroder, Tobias; Heinosaari, Teiko
2016-08-01
We develop a technique to find simultaneous measurements for noisy quantum observables in finite-dimensional Hilbert spaces. We use the method to derive lower bounds for the noise needed to make incompatible measurements jointly measurable. Using our strategy together with recent developments in the field of one-sided quantum information processing, we show that the attained lower bounds are tight for various symmetric sets of quantum measurements. We use this characterisation to prove the existence of so-called 4-Specker sets, i.e. sets of four incompatible observables with compatible subsets, in the qubit case.
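For the qubit case discussed above, a well-known criterion (due to Busch) states that two unbiased noisy qubit observables with Bloch vectors ηa and ηb are jointly measurable iff |ηa + ηb| + |ηa − ηb| ≤ 2. A small numerical check of the resulting η ≤ 1/√2 threshold for orthogonal axes; this is a standard textbook bound used as an illustration, not the paper's general method:

```python
# Busch's criterion for joint measurability of two UNBIASED noisy qubit
# observables with Bloch vectors eta*a and eta*b: jointly measurable iff
# |eta*a + eta*b| + |eta*a - eta*b| <= 2. Illustrative noise-threshold check.
import math

def jointly_measurable(a, b, eta):
    """a, b: unit Bloch 3-vectors; eta: visibility (noise) parameter."""
    plus  = math.sqrt(sum((eta * x + eta * y) ** 2 for x, y in zip(a, b)))
    minus = math.sqrt(sum((eta * x - eta * y) ** 2 for x, y in zip(a, b)))
    return plus + minus <= 2.0 + 1e-12

x_axis, z_axis = (1, 0, 0), (0, 0, 1)
# Orthogonal qubit observables become compatible exactly at eta = 1/sqrt(2)
print(jointly_measurable(x_axis, z_axis, 0.70))   # just below the threshold
print(jointly_measurable(x_axis, z_axis, 0.72))   # just above the threshold
```

The same style of check generalizes the idea of a noise lower bound: one asks for the largest η at which a given set of noisy observables still admits a joint measurement.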
Low Cost Coherent Doppler Lidar Data Acquisition and Processing
NASA Technical Reports Server (NTRS)
Barnes, Bruce W.; Koch, Grady J.
2003-01-01
This paper details the development of a low-cost data acquisition and processing system, with a short development time, for a coherent Doppler lidar. This was done using common laboratory equipment and a small software investment. The system provides near real-time wind profile measurements. Coding flexibility created a very useful test bed for new techniques.
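The core of coherent Doppler lidar processing is estimating the Doppler shift of the heterodyne signal and converting it to line-of-sight wind speed via v = λ·f_d / 2. A minimal sketch using a naive DFT peak search; the sample rate, Doppler shift, and 2.05-μm wavelength below are illustrative assumptions, not the instrument's parameters:

```python
# Sketch of the core step in coherent Doppler lidar processing: find the
# Doppler shift with a DFT peak search and convert it to line-of-sight
# wind speed, v = lambda * f_d / 2. Parameters are illustrative only.
import cmath, math

def doppler_peak(samples, sample_rate):
    """Return the frequency (Hz) of the strongest DFT bin (naive DFT)."""
    n = len(samples)
    best_k, best_mag = 0, -1.0
    for k in range(n // 2):                      # positive frequencies only
        coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * sample_rate / n

fs, f_d, wavelength = 100e6, 25e6, 2.05e-6       # assumed 2.05-um system
signal = [cmath.exp(2j * math.pi * f_d * t / fs) for t in range(256)]
v_los = wavelength * doppler_peak(signal, fs) / 2
print(round(v_los, 3))                           # line-of-sight speed, m/s
```

A real system would replace the naive DFT with an FFT and average many pulse spectra per range gate, but the shift-to-velocity conversion is the same.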
Developing the Scale of Teacher Self-Efficacy in Teaching Process
ERIC Educational Resources Information Center
Korkmaz, Fahrettin; Unsal, Serkan
2016-01-01
The purpose of this study is to develop a reliable and valid measurement tool which will reveal teachers' self-competence in education process. Participants of the study are 300 teachers working at state primary schools in the province of Gaziantep. Results of the exploratory factor analysis administered to the scale in order to determine its…
The Development of a Student Survey on Attitudes towards Mathematics Teaching-Learning Processes
ERIC Educational Resources Information Center
Mutohir, Toho Cholik; Lowrie, Tom; Patahuddin, Sitti Maesuri
2018-01-01
This study aimed to develop a survey instrument to measure student attitudes towards mathematics teaching-learning processes that is appropriate for the Indonesian context. This study consisted of two phases: Phase 1 (n = 320) was a pilot study to assess the suitability of the instrument items for Indonesian students. Phase 2 (n = 1001) was…
SCOOP: A Measurement and Database of Student Online Search Behavior and Performance
ERIC Educational Resources Information Center
Zhou, Mingming
2015-01-01
The ability to access and process massive amounts of online information is required in many learning situations. In order to develop a better understanding of student online search process especially in academic contexts, an online tool (SCOOP) is developed for tracking mouse behavior on the web to build a more extensive account of student web…
ERIC Educational Resources Information Center
Fukuda, Makiko
2014-01-01
The present study revealed the dynamic process of speech development in a domestic immersion program by seven adult beginning learners of Japanese. The speech data were analyzed with fluency, accuracy, and complexity measurements at group, interindividual, and intraindividual levels. The results revealed the complex nature of language development…
Recommended Practices in Thrust Measurements
NASA Technical Reports Server (NTRS)
Polk, James E.; Pancotti, Anthony; Haag, Thomas; King, Scott; Walker, Mitchell; Blakely, Joseph; Ziemer, John
2013-01-01
Accurate, direct measurement of thrust or impulse is one of the most critical elements of electric thruster characterization, and one of the most difficult measurements to make. The American Institute of Aeronautics and Astronautics has started an initiative to develop standards for many important measurement processes in electric propulsion, including thrust measurements. This paper summarizes recommended practices for the design, calibration, and operation of pendulum thrust stands, which are widely recognized as the best approach for measuring μN- to mN-level thrust and μN·s-level impulse bits. The fundamentals of pendulum thrust stand operation are reviewed, along with its implementation in hanging pendulum, inverted pendulum, and torsional balance configurations. Methods of calibration and recommendations for calibration processes are presented. Sources of error are identified and methods for data processing and uncertainty analysis are discussed. This review is intended to be the first step toward a recommended practices document to help the community produce high quality thrust measurements.
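For a hanging-pendulum stand of the kind recommended above, small-angle deflection is linear in the applied force, and in practice the effective stiffness is obtained by calibration with known forces. A minimal sketch with hypothetical calibration data:

```python
# Minimal model of a pendulum thrust stand: for small angles the restoring
# force is linear, F = k*x, and the stiffness k is found by calibrating
# against known forces. Calibration values below are hypothetical.
def stiffness_from_calibration(known_forces_N, deflections_m):
    """Least-squares slope through the origin: k = sum(F*x) / sum(x^2)."""
    num = sum(f * x for f, x in zip(known_forces_N, deflections_m))
    den = sum(x * x for x in deflections_m)
    return num / den

def thrust(deflection_m, k):
    return k * deflection_m

# Calibration with three known forces (N) and measured deflections (m)
k = stiffness_from_calibration([1e-3, 2e-3, 3e-3], [0.5e-3, 1.0e-3, 1.5e-3])
print(thrust(0.8e-3, k))   # thrust in newtons for a 0.8 mm deflection
```

Real stands add damping, drift correction, and in-situ calibration hardware, but the measurement still reduces to a calibrated force-deflection relation of this form.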
IMM estimator with out-of-sequence measurements
NASA Astrophysics Data System (ADS)
Bar-Shalom, Yaakov; Chen, Huimin
2004-08-01
In multisensor tracking systems that operate in a centralized information processing architecture, measurements from the same target obtained by different sensors can arrive at the processing center out of sequence. In order to avoid either a delay in the output or the need for reordering and reprocessing an entire sequence of measurements, such measurements have to be processed as out-of-sequence measurements (OOSM). Recent work developed procedures for incorporating OOSMs into a Kalman filter (KF). Since the state of the art tracker for real (maneuvering) targets is the Interacting Multiple Model (IMM) estimator, this paper presents the algorithm for incorporating OOSMs into an IMM estimator. Both data association and estimation are considered. Simulation results are presented for two realistic problems using measurements from two airborne GMTI sensors. It is shown that the proposed algorithm for incorporating OOSMs into an IMM estimator yields practically the same performance as the reordering and in-sequence reprocessing of the measurements.
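The equivalence the paper reports (OOSM processing matching in-sequence reprocessing) can be illustrated in a much-simplified setting. A minimal sketch, not the paper's algorithm: a scalar Kalman filter for a static state (random walk with zero process noise), where no retrodiction is needed and a plain update with the delayed measurement exactly reproduces the in-sequence result:

```python
def kf_update(x, P, z, r):
    """Scalar Kalman update: posterior mean/variance after a
    measurement z with noise variance r."""
    K = P / (P + r)          # Kalman gain
    return x + K * (z - x), (1 - K) * P

x0, P0, r = 0.0, 100.0, 1.0   # prior
z1, z2, z3 = 1.0, 1.2, 0.9   # measurements taken at t1 < t2 < t3

# In-sequence processing: z1, z2, z3
x, P = x0, P0
for z in (z1, z2, z3):
    x, P = kf_update(x, P, z, r)

# z2 arrives late: process z1, z3 first, then the OOSM z2.
# For a static state the retrodiction step is trivial, so updating
# directly with the delayed measurement matches the in-sequence result.
xo, Po = x0, P0
for z in (z1, z3, z2):
    xo, Po = kf_update(xo, Po, z, r)

print(abs(x - xo) < 1e-9 and abs(P - Po) < 1e-9)  # -> True
```

With nonzero process noise and maneuvering (IMM) dynamics, the retrodiction and covariance bookkeeping are what the cited OOSM algorithms supply.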
AAL service development loom--from the idea to a marketable business model.
Kriegel, Johannes; Auinger, Klemens
2015-01-01
The Ambient Assisted Living (AAL) market is still in an early stage of development. Previous approaches of comprehensive AAL services are mostly supply-side driven and focused on hardware and software. Usually this type of AAL solutions does not lead to a sustainable success on the market. Research and development increasingly focuses on demand and customer requirements in addition to the social and legal framework. The question is: How can a systematic performance measurement strategy along a service development process support the market-ready design of a concrete business model for an AAL service? Within the EU funded research project DALIA (Assistant for Daily Life Activities at Home) an iterative service development process uses an adapted Osterwalder business model canvas. The application of a performance measurement index (PMI) to support the process has been developed and tested. Development of an iterative service development model using a supporting PMI. The PMI framework is developed throughout the engineering of a virtual assistant (AVATAR) as a modular interface to connect informal carers with necessary and useful services. Future research should seek to ensure that the PMI enables meaningful transparency regarding targeting (e.g. innovative AAL service), design (e.g. functional hybrid AAL service) and implementation (e.g. marketable AAL support services). To this end, further testing in practice is required. The aim must be to develop a weighted PMI in the context of further research, which supports both the service engineering and the subsequent service management process.
Preliminary development of digital signal processing in microwave radiometers
NASA Technical Reports Server (NTRS)
Stanley, W. D.
1980-01-01
Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing noiselike microwave radiometer signals.
NASA Technical Reports Server (NTRS)
O'Connor, Brian; Hernandez, Deborah; Hornsby, Linda; Brown, Maria; Horton-Mullins, Kathryn
2017-01-01
Outline: Background of ISS (International Space Station) Material Science Research Rack; NASA SCA (Sample Cartridge Assembly) Design; GEDS (Gravitational Effects in Distortion in Sintering) Experiment Ampoule Design; Development Testing Summary; Thermal Modeling and Analysis. Summary: GEDS design development challenging (GEDS Ampoule design developed through MUGS (Microgravity) testing; Short duration transient sample processing; Unable to measure sample temperatures); MUGS Development testing used to gather data (Actual LGF (Low Gradient Furnace)-like furnace response; Provided sample for sintering evaluation); Transient thermal model integral to successful GEDS experiment (Development testing provided furnace response; PI (Performance Indicator) evaluation of sintering anchored model evaluation of processing durations; Thermal transient model used to determine flight SCA sample processing profiles).
Lambertus, Gordon; Shi, Zhenqi; Forbes, Robert; Kramer, Timothy T; Doherty, Steven; Hermiller, James; Scully, Norma; Wong, Sze Wing; LaPack, Mark
2014-01-01
An on-line analytical method based on transmission near-infrared spectroscopy (NIRS) for the quantitative determination of water concentrations (in parts per million) was developed and applied to the manufacture of a pharmaceutical intermediate. Calibration models for water analysis, built at the development site and applied at the manufacturing site, were successfully demonstrated during six manufacturing runs at a 250-gallon scale. The water measurements will be used as a forward-processing control point following distillation of a toluene product solution prior to use in a Grignard reaction. The most significant impact of using this NIRS-based process analytical technology (PAT) to replace off-line measurements is the significant reduction in the risk of operator exposure through the elimination of sampling of a severely lachrymatory and mutagenic compound. The work described in this report illustrates the development effort from proof-of-concept phase to manufacturing implementation.
Steele, Ann; Karmiloff-Smith, Annette; Cornish, Kim; Scerif, Gaia
2012-11-01
Attention is construed as multicomponential, but the roles of its distinct subfunctions in shaping the broader developing cognitive landscape are poorly understood. The current study assessed 3- to 6-year-olds (N=83) to: (a) trace developmental trajectories of attentional processes and their structure in early childhood and (b) measure the impact of distinct attention subfunctions on concurrent and longitudinal abilities related to literacy and numeracy. Distinct trajectories across attention measures revealed the emergence of 2 attentional factors, encompassing "executive" and "sustained-selective" processes. Executive attention predicted concurrent abilities across domains at Time 1, whereas sustained-selective attention predicted basic numeracy 1 year later. These concurrent and longitudinal constraints cast a broader light on the unfolding relations between domain-general and domain-specific processes over early childhood. © 2012 The Authors. Child Development © 2012 Society for Research in Child Development, Inc.
Dodge, Kenneth A.; Lansford, Jennifer E.; Burks, Virginia Salzer; Bates, John E.; Pettit, Gregory S.; Fontaine, Reid; Price, Joseph M.
2009-01-01
The relation between social rejection and growth in antisocial behavior was investigated. In Study 1, 259 boys and girls (34% African American) were followed from Grades 1 to 3 (ages 6–8 years) to Grades 5 to 7 (ages 10–12 years). Early peer rejection predicted growth in aggression. In Study 2, 585 boys and girls (16% African American) were followed from kindergarten to Grade 3 (ages 5–8 years), and findings were replicated. Furthermore, early aggression moderated the effect of rejection, such that rejection exacerbated antisocial development only among children initially disposed toward aggression. In Study 3, social information-processing patterns measured in Study 1 were found to mediate partially the effect of early rejection on later aggression. In Study 4, processing patterns measured in Study 2 replicated the mediation effect. Findings are integrated into a recursive model of antisocial development. PMID:12705561
NASA Astrophysics Data System (ADS)
Romanosky, Robert R.
2017-05-01
The National Energy Technology Laboratory (NETL), under the Department of Energy (DOE) Fossil Energy (FE) Program, is leading the effort not only to develop near-zero-emission power generation systems, but to increase the efficiency and availability of current power systems. The overarching goal of the program is to provide clean, affordable power using domestic resources. Highly efficient, low-emission power systems can have extreme conditions of high temperatures up to 1600 °C, high pressures up to 600 psi, high particulate loadings, and corrosive atmospheres that require monitoring. Sensing in these harsh environments can provide key information that directly impacts process control and system reliability. The lack of suitable measurement technology serves as a driver for innovations in harsh-environment sensor development. Advancements in sensing using optical fibers are key efforts within NETL's sensor development program, as these approaches offer the potential to survive and provide critical information about these processes. An overview of the sensor development supported by the National Energy Technology Laboratory (NETL) will be given, including research in the areas of sensor materials, designs, and measurement types. New approaches to intelligent sensing, sensor placement, and process control using networked sensors will be discussed, as will novel approaches to fiber device design concurrent with materials development research and development in modified and coated silica and sapphire fiber-based sensors. The use of these sensors for both single-point and distributed measurements of temperature, pressure, strain, and a select suite of gases will be addressed. Additional areas of research include novel control architectures and communication frameworks, device integration for distributed sensing, and imaging and other novel approaches to monitoring and controlling advanced processes.
The close coupling of the sensor program with process modeling and control will be discussed for the overarching goal of clean power production.
Hunter, Linda; Myles, Joanne; Worthington, James R; Lebrun, Monique
2011-01-01
This article discusses the background and process for developing a multi-year corporate quality plan. The Ottawa Hospital's goal is to be a top 10% performer in quality and patient safety in North America. In order to create long-term measurable and sustainable changes in the quality of patient care, The Ottawa Hospital embarked on the development of a three-year strategic corporate quality plan. This was accomplished by engaging the organization at all levels and defining quality frameworks, aligning with internal and external expectations, prioritizing strategic goals, articulating performance measurements and reporting to stakeholders while maintaining a transparent communication process. The plan was developed through an iterative process that engaged a broad base of health professionals, physicians, support staff, administration and senior management. A literature review of quality frameworks was undertaken, a Quality Plan Working Group was established, 25 key stakeholder interviews were conducted and 48 clinical and support staff consultations were held. The intent was to gather information on current quality initiatives and challenges encountered and to prioritize corporate goals and then create the quality plan. Goals were created and then prioritized through an affinity exercise. Action plans were developed for each goal and included objectives, tasks and activities, performance measures (structure, process and outcome), accountabilities and timelines. This collaborative methodology resulted in the development of a three-year quality plan. Six corporate goals were outlined by the tenets of the quality framework for The Ottawa Hospital: access to care, appropriate care (effective and efficient), safe care and satisfaction with care. Each of the six corporate goals identified objectives and supporting action plans with accountabilities outlining what would be accomplished in years one, two and three. 
The three-year quality plan was approved by senior management and the board in April 2009. This process has supported The Ottawa Hospital's journey of excellence through the creation of a quality plan that will enable long-term measurable and sustainable changes in the quality of patient care. It also engaged healthcare providers who aim to achieve more measured quality patient care, engaged practitioners through collaboration resulting in both alignment of goals and outcomes and allowed for greater commitment by those responsible for achieving quality goals.
Bonow, Robert O; Douglas, Pamela S; Buxton, Alfred E; Cohen, David J; Curtis, Jeptha P; Delong, Elizabeth; Drozda, Joseph P; Ferguson, T Bruce; Heidenreich, Paul A; Hendel, Robert C; Masoudi, Frederick A; Peterson, Eric D; Taylor, Allen J
2011-09-27
Consistent with the growing national focus on healthcare quality, the American College of Cardiology Foundation (ACCF) and the American Heart Association (AHA) have taken a leadership role over the past decade in developing measures of the quality of cardiovascular care by convening a joint ACCF/AHA Task Force on Performance Measures. The Task Force is charged with identifying the clinical topics appropriate for the development of performance measures and with assembling writing committees composed of clinical and methodological experts in collaboration with appropriate subspecialty societies. The Task Force has also created methodology documents that offer guidance in the development of process, outcome, composite, and efficiency measures. Cardiovascular performance measures using existing ACCF/AHA methodology are based on Class I or Class III guidelines recommendations, usually with Level A evidence. These performance measures, based on evidence-based ACCF/AHA guidelines, remain the most rigorous quality measures for both internal quality improvement and public reporting. However, many of the tools for diagnosis and treatment of cardiovascular disease involve advanced technologies, such as cardiac imaging, for which there are often no underlying guideline documents. Because these technologies affect the quality of cardiovascular care and also have the potential to contribute to cardiovascular health expenditures, there is a need for more critical assessment of the use of technology, including the development of quality and performance measures in areas in which guideline recommendations are absent. The evaluation of quality in the use of cardiovascular technologies requires consideration of multiple parameters that differ from other healthcare processes. The present document describes methodology for development of 2 new classes of quality measures in these situations, appropriate use measures and structure/safety measures. 
Appropriate use measures are based on specific indications, processes, or parameters of care for which high level of evidence data and Class I or Class III guideline recommendations may be lacking but are addressed in ACCF appropriate use criteria documents. Structure/safety measures represent measures developed to address structural aspects of the use of healthcare technology (e.g., laboratory accreditation, personnel training, and credentialing) or quality issues related to patient safety when there are neither guidelines recommendations nor appropriate use criteria. Although the strength of evidence for appropriate use measures and structure/safety measures may not be as strong as that for formal performance measures, they are quality measures that are otherwise rigorously developed, reviewed, tested, and approved in the same manner as ACCF/AHA performance measures. The ultimate goal of the present document is to provide direction in defining and measuring the appropriate use-avoiding not only underuse but also overuse and misuse-and proper application of cardiovascular technology and to describe how such appropriate use measures and structure/safety measures might be developed for the purposes of quality improvement and public reporting. It is anticipated that this effort will help focus the national dialogue on the use of cardiovascular technology and away from the current concerns about volume and cost alone to a more holistic emphasis on value. Copyright © 2011 American College of Cardiology Foundation and the American Heart Association, Inc. Published by Elsevier Inc. All rights reserved.
Methods for Evaluating Emotions Evoked by Food Experiences: A Literature Review
Kaneko, Daisuke; Toet, Alexander; Brouwer, Anne-Marie; Kallen, Victor; van Erp, Jan B. F.
2018-01-01
Besides sensory characteristics of food, food-evoked emotion is a crucial factor in predicting consumer's food preference and therefore in developing new products. Many measures have been developed to assess food-evoked emotions. The aim of this literature review is (i) to give an exhaustive overview of measures used in current research and (ii) to categorize these methods along measurement level (physiological, behavioral, and cognitive) and emotional processing level (unconscious sensory, perceptual/early cognitive, and conscious/decision making) level. This 3 × 3 categorization may help researchers to compile a set of complementary measures (“toolbox”) for their studies. We included 101 peer-reviewed articles that evaluate consumer's emotions and were published between 1997 and 2016, providing us with 59 different measures. More than 60% of these measures are based on self-reported, subjective ratings and questionnaires (cognitive measurement level) and assess the conscious/decision-making level of emotional processing. This multitude of measures and their overrepresentation in a single category hinders the comparison of results across studies and building a complete multi-faceted picture of food-evoked emotions. We recommend (1) to use widely applied, validated measures only, (2) to refrain from using (highly correlated) measures from the same category but use measures from different categories instead, preferably covering all three emotional processing levels, and (3) to acquire and share simultaneously collected physiological, behavioral, and cognitive datasets to improve the predictive power of food choice and other models. PMID:29937744
Development of a battery status monitor
NASA Technical Reports Server (NTRS)
Zimmerman, R. I.
1974-01-01
A prototype battery status monitor system has been developed. The functions of the system are: (1) to provide the energy status of the battery, (2) to measure and transmit basic battery parameters, (3) to process these measurements required to determine abnormal functioning of the battery, and (4) to transmit warning signals of the abnormal condition along with a go/no go signal. The system was developed for use with the space shuttle.
Methods of measurement for semiconductor materials, process control, and devices
NASA Technical Reports Server (NTRS)
Bullis, W. M. (Editor)
1972-01-01
Activities directed toward the development of methods of measurement for semiconductor materials, process control, and devices are described. Topics investigated include: measurements of transistor delay time; application of the infrared response technique to the study of radiation-damaged, lithium-drifted silicon detectors; and identification of a condition that minimizes wire flexure and reduces the failure rate of wire bonds in transistors and integrated circuits under slow thermal cycling conditions. Supplementary data concerning staff, standards committee activities, technical services, and publications are included as appendixes.
Development of fast measurements of concentration of NORM U-238 by HPGe
NASA Astrophysics Data System (ADS)
Cha, Seokki; Kim, Siu; Kim, Geehyun
2017-02-01
Naturally Occurring Radioactive Material (NORM), present since the formation of the earth, can be found all around us, and even people whose work is unrelated to radiation are exposed to unnecessary radiation from it. NORM poses a potential risk when it is concentrated or transformed by artificial activities. Accordingly, the development of fast measurement methods for NORM is emerging as a way to prevent radiation exposure of the general public and of workers in industries that use material in which NORM is concentrated or transformed. Against this background, many countries have tried to manage NORM and have enacted regulatory legislation. To manage NORM efficiently, new measurement methods are needed that can analyze nuclides and their concentrations quickly and accurately. In this study, a fast and reliable measurement method was developed. In addition to confirming the reliability of the fast measurement, we obtained results that suggest the possibility of developing further fast measurement methods; a follow-up study along these lines is therefore possible. The results of this study will be very useful for the regulatory system that manages NORM. Specifically, this study reviewed two indirect measurement methods for NORM U-238 using HPGe, based on the theory of equilibrium between mother and daughter nuclides in the U-238 decay chain. For a comparative study (to assess reliability), a direct measurement using an alpha spectrometer, with its complicated pre-processing, was also implemented.
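The indirect approach rests on secular equilibrium: once the decay chain is in equilibrium, the U-238 activity equals that of a measurable gamma-emitting daughter. A minimal sketch of the resulting activity calculation; the line, efficiency, and count values below are hypothetical, not from the study:

```python
def activity_bq(net_counts, live_time_s, efficiency, gamma_yield):
    """Activity of the emitting daughter from a net HPGe peak area:
    A = N / (t * eps * I_gamma). Under secular equilibrium this also
    equals the U-238 activity."""
    return net_counts / (live_time_s * efficiency * gamma_yield)

# Hypothetical example: Pa-234m 1001 keV line, full-energy peak
# efficiency 0.5%, gamma emission probability 0.84%, 10 h live time.
a = activity_bq(net_counts=1500, live_time_s=36000,
                efficiency=0.005, gamma_yield=0.0084)
print(round(a, 2))  # -> 992.06 (Bq, assuming equilibrium)
```

The low emission probabilities of U-238 daughter lines are why long counts or high-efficiency detectors are needed, which is the motivation for faster methods.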
2003-11-05
KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, a technician takes readings for pre-assembly measurements on the Japanese Experiment Module (JEM). Developed by the Japan Aerospace Exploration Agency (JAXA), the JEM will enhance the unique research capabilities of the orbiting complex by providing an additional environment for astronauts to conduct science experiments.
2003-11-05
KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, technicians begin pre-assembly measurements on the Japanese Experiment Module (JEM). Developed by the Japan Aerospace Exploration Agency (JAXA), the JEM will enhance the unique research capabilities of the orbiting complex by providing an additional environment for astronauts to conduct science experiments.
2003-11-05
KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, technicians take readings for pre-assembly measurements on the Japanese Experiment Module (JEM). Developed by the Japan Aerospace Exploration Agency (JAXA), the JEM will enhance the unique research capabilities of the orbiting complex by providing an additional environment for astronauts to conduct science experiments.
2003-11-05
KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, the Japanese Experiment Module (JEM) rests on a workstand during pre-assembly measurement activities. Developed by the Japan Aerospace Exploration Agency (JAXA), the JEM will enhance the unique research capabilities of the orbiting complex by providing an additional environment for astronauts to conduct science experiments.
Synaptogenesis and heritable aspects of executive attention.
Fossella, John A; Sommer, Tobias; Fan, Jin; Pfaff, Don; Posner, Michael I
2003-01-01
In humans, changes in brain structure and function can be measured non-invasively during postnatal development. In animals, advanced optical imaging measures can track the formation of synapses during learning and behavior. With the recent progress in these technologies, it is appropriate to begin to assess how the physiological processes of synapse, circuit, and neural network formation relate to the process of cognitive development. Of particular interest is the development of executive function, which develops more gradually in humans. One approach that has shown promise is molecular genetics. The completion of the human genome project and the human genome diversity project make it straightforward to ask whether variation in a particular gene correlates with variation in behavior, brain structure, brain activity, or all of the above. Strategies that unify the wealth of biochemical knowledge pertaining to synapse formation with the functional measures of brain structure and activity may lead to new insights in developmental cognitive psychology. Copyright 2003 Wiley-Liss, Inc.
Autonomous sensor particle for parameter tracking in large vessels
NASA Astrophysics Data System (ADS)
Thiele, Sebastian; Da Silva, Marco Jose; Hampel, Uwe
2010-08-01
A self-powered and neutrally buoyant sensor particle has been developed for the long-term measurement of spatially distributed process parameters in the chemically harsh environments of large vessels. One intended application is the measurement of flow parameters in stirred fermentation biogas reactors. The prototype sensor particle is a robust and neutrally buoyant capsule, which allows free movement with the flow. It contains measurement devices that log the temperature, absolute pressure (immersion depth) and 3D-acceleration data. A careful calibration including an uncertainty analysis has been performed. Furthermore, autonomous operation of the developed prototype was successfully proven in a flow experiment in a stirred reactor model. It showed that the sensor particle is feasible for future application in fermentation reactors and other industrial processes.
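The immersion depth mentioned above follows from the absolute pressure reading by the hydrostatic relation. A minimal sketch, assuming a water-like fluid density (an actual fermentation substrate would differ):

```python
def immersion_depth_m(p_abs_pa, p_atm_pa=101325.0, rho=1000.0, g=9.81):
    """Hydrostatic immersion depth from an absolute pressure sensor:
    depth = (p_abs - p_atm) / (rho * g), with rho the fluid density."""
    return (p_abs_pa - p_atm_pa) / (rho * g)

# An absolute reading of 131 kPa corresponds to roughly 3 m of water:
print(round(immersion_depth_m(131_000.0), 2))  # -> 3.02
```

Logging this depth together with the 3D-acceleration data is what lets the particle reconstruct its trajectory through the vessel.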
Implicit cognitive processes in psychopathology: an introduction.
Wiers, Reinout W; Teachman, Bethany A; De Houwer, Jan
2007-06-01
Implicit or automatic processes are important in understanding the etiology and maintenance of psychopathological problems. In order to study implicit processes in psychopathology, measures are needed that are valid and reliable when applied to clinical problems. One of the main topics in this special issue concerns the development and validation of new or modified implicit tests in different domains of psychopathology. The other main topic concerns the prediction of clinical outcomes and new ways to directly influence implicit processes in psychopathology. We summarize the contributions to this special issue and discuss how they further our knowledge of implicit processes in psychopathology and how to measure them.
Procurement performance measurement system in the health care industry.
Kumar, Arun; Ozdamar, Linet; Ng, Chai Peng
2005-01-01
The rising operating cost of providing healthcare is of concern to health care providers. As such, measurement of procurement performance will enable competitive advantage and provide a framework for continuous improvement. The objective of this paper is to develop a procurement performance measurement system. The paper reviews the existing literature in procurement performance measurement to identify the key areas of purchasing performance. By studying the three components in the supply chain collectively with the resources, procedures and output, a model has been developed. Additionally, a balanced scorecard is proposed by establishing a set of generic measures and six perspectives. A case study conducted at the Singapore Hospital applies the conceptual model to describe the purchasing department and the activities within and outside the department. The results indicate that the material management department has already made a good start in measuring the procurement process through the implementation of the balanced scorecard. Many data are collected but not properly collated and utilized. Areas lacking measurement include cycle time of delivery, order processing time, effectiveness, efficiency and reliability. Though a lot of hard work was involved, the advantages of establishing a measurement system outweigh the costs and efforts involved in its implementation. Results of balanced scorecard measurements provide decision-makers with critical information on efficiency and effectiveness of the purchasing department's work. The measurement model developed could be used for any hospital procurement system.
Methods of measurement for semiconductor materials, process control, and devices
NASA Technical Reports Server (NTRS)
Bullis, W. M. (Editor)
1973-01-01
This progress report describes NBS activities directed toward the development of methods of measurement for semiconductor materials, process control, and devices. Significant accomplishments during this reporting period include design of a plan to provide standard silicon wafers for four-probe resistivity measurements for the industry, publication of a summary report on the photoconductive decay method for measuring carrier lifetime, publication of a comprehensive review of the field of wire bond fabrication and testing, and successful completion of organizational activity leading to the establishment of a new group on quality and hardness assurance in ASTM Committee F-1 on Electronics. Work is continuing on measurement of resistivity of semiconductor crystals; characterization of generation-recombination-trapping centers in silicon; study of gold-doped silicon; development of the infrared response technique; evaluation of wire bonds and die attachment; and measurement of thermal properties of semiconductor devices, delay time and related carrier transport properties in junction devices, and noise properties of microwave diodes.
Development an Instrument to Measure University Students' Attitude towards E-Learning
ERIC Educational Resources Information Center
Mehra, Vandana; Omidian, Faranak
2012-01-01
The study of student's attitude towards e-learning can in many ways help managers better prepare in light of e-learning for the future. This article describes the process of the development of an instrument to measure university students' attitude towards e-learning. The scale was administered to 200 University students from two countries (India…
ERIC Educational Resources Information Center
Bass, Kristin M.; Drits-Esser, Dina; Stark, Louisa A.
2016-01-01
The credibility of conclusions made about the effectiveness of educational interventions depends greatly on the quality of the assessments used to measure learning gains. This essay, intended for faculty involved in small-scale projects, courses, or educational research, provides a step-by-step guide to the process of developing, scoring, and…
ERIC Educational Resources Information Center
Galperin, Bella L.; Tabak, Filiz; Kaynama, Shohreh A.; Ghannadian, F. Frank
2017-01-01
The authors examine the development and implementation of measures for the 2013 Association to Advance Collegiate Schools of Business (AACSB) thematic dimensions of innovation, engagement, and impact (IEI) at two AACSB-accredited business schools. Agreement exists among faculty in Study 1 that IEI should be viewed as interrelated and overlapping…
Conjoint-measurement framework for the study of probabilistic information processing.
NASA Technical Reports Server (NTRS)
Wallsten, T. S.
1972-01-01
The theory of conjoint measurement described by Krantz et al. (1971) is shown to indicate how a descriptive model of human processing of probabilistic information built around Bayes' rule is to be tested and how it is to be used to obtain subjective scale values. Specific relationships concerning these scale values are shown to emerge, and the theoretical prospects resulting from this development are discussed.
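The Bayesian model referred to here is most transparent in log-odds form, where the prior and each datum contribute additively; it is exactly this additive structure that the conjoint-measurement axioms allow one to test with ordinal judgments alone. A minimal sketch of the normative (additive) combination rule, with hypothetical likelihood ratios:

```python
import math

def posterior_log_odds(prior_log_odds, log_likelihood_ratios):
    """Bayes' rule for two hypotheses in log-odds form:
    log posterior odds = log prior odds + sum of log likelihood ratios."""
    return prior_log_odds + sum(log_likelihood_ratios)

def odds_to_prob(log_odds):
    """Convert log odds back to P(H1)."""
    return 1.0 / (1.0 + math.exp(-log_odds))

# Even prior odds, two independent data with likelihood ratios 3 and 2:
lo = posterior_log_odds(0.0, [math.log(3), math.log(2)])
print(round(odds_to_prob(lo), 3))  # -> 0.857 (i.e., 6/7)
```

A descriptive model of human judgment would replace the exact log likelihood ratios with subjective scale values whose additive combination conjoint measurement can verify or reject.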
NASA Astrophysics Data System (ADS)
Berthias, F.; Feketeová, L.; Della Negra, R.; Dupasquier, T.; Fillol, R.; Abdoul-Carime, H.; Farizon, B.; Farizon, M.; Märk, T. D.
2018-01-01
The combination of the Dispositif d'Irradiation d'Agrégats Moléculaire with the correlated ion and neutral time of flight-velocity map imaging technique provides a new way to explore processes occurring subsequent to the excitation of charged nano-systems. The present contribution describes in detail the methods developed for the quantitative measurement of branching ratios and cross sections for collision-induced dissociation processes of water cluster nano-systems. These methods are based on measurements of the detection efficiency of neutral fragments produced in these dissociation reactions. Moreover, measured detection efficiencies are used here to extract the number of neutral fragments produced for a given charged fragment.
Risbrough, Victoria B; Glenn, Daniel E; Baker, Dewleen G
The use of quantitative, laboratory-based measures of threat in humans for proof-of-concept studies and target development for novel drug discovery has grown tremendously in the last 2 decades. In particular, in the field of posttraumatic stress disorder (PTSD), human models of fear conditioning have been critical in shaping our theoretical understanding of fear processes and importantly, validating findings from animal models of the neural substrates and signaling pathways required for these complex processes. Here, we will review the use of laboratory-based measures of fear processes in humans including cued and contextual conditioning, generalization, extinction, reconsolidation, and reinstatement to develop novel drug treatments for PTSD. We will primarily focus on recent advances in using behavioral and physiological measures of fear, discussing their sensitivity as biobehavioral markers of PTSD symptoms, their response to known and novel PTSD treatments, and in the case of d-cycloserine, how well these findings have translated to outcomes in clinical trials. We will highlight some gaps in the literature and needs for future research, discuss benefits and limitations of these outcome measures in designing proof-of-concept trials, and offer practical guidelines on design and interpretation when using these fear models for drug discovery.
Application of ideal pressure distribution in development process of automobile seats.
Kilincsoy, U; Wagner, A; Vink, P; Bubb, H
2016-07-19
In car seat design, the ideal pressure distribution is important because the seat is the largest contact surface between the human and the car. Because obstacles hinder a more general application of the ideal pressure distribution in seating design, multidimensional measuring techniques combined with extensive user tests are necessary. The objective of this study is to apply and integrate knowledge about the ideal pressure distribution into the seat design process of a car manufacturer in an efficient way. The ideal pressure distribution was combined with pressure measurement, in this case pressure mats. To integrate this theoretical knowledge of seating comfort into the car manufacturer's seat development process, a special user interface was defined and developed. Mapping the measured pressure distribution in real time, accurately scaled to the actual seats used in the test setups, led directly to design implications for the seats, even during the test situation. Detailed analysis of the subjects' feedback was correlated in real time with objective measurements of their pressure distributions, and existing seating characteristics were taken into account as well. A user interface can incorporate theoretical and validated state-of-the-art models of comfort. Consequently, this information can reduce extensive testing and lead to more detailed results in a shorter time period.
Experimental validation of thermo-chemical algorithm for a simulation of pultrusion processes
NASA Astrophysics Data System (ADS)
Barkanov, E.; Akishin, P.; Miazza, N. L.; Galvez, S.; Pantelelis, N.
2018-04-01
To provide a better understanding of pultrusion processes with and without temperature control, and to support pultrusion tooling design, an algorithm based on a mixed time-integration scheme and the nodal control volumes method has been developed. In the present study, its experimental validation is carried out with newly developed cure sensors that measure the electrical resistivity and temperature on the profile surface. Through this verification process, the set of initial data used to simulate the pultrusion of a rod profile has been successfully corrected and finally defined.
Application of CMOS Technology to Silicon Photomultiplier Sensors.
D'Ascenzo, Nicola; Zhang, Xi; Xie, Qingguo
2017-09-25
We use the 180 nm GLOBALFOUNDRIES (GF) BCDLite CMOS process for the production of a silicon photomultiplier prototype. We study the main characteristics of the developed sensor in comparison with commercial SiPMs obtained in custom technologies and other SiPMs developed with CMOS-compatible processes. We support our discussion with a transient modeling of the detection process of the silicon photomultiplier as well as with a series of static and dynamic experimental measurements in dark and illuminated environments.
Online sensing and control of oil in process wastewater
NASA Astrophysics Data System (ADS)
Khomchenko, Irina B.; Soukhomlinoff, Alexander D.; Mitchell, T. F.; Selenow, Alexander E.
2002-02-01
Industrial processes that must remove high concentrations of oil from their waste streams find it extremely difficult to measure and control the water purification process. Most oil separation processes involve chemical separation using highly corrosive caustics, acids, surfactants, and emulsifiers. The output of this chemical treatment process includes highly adhesive tar-like globules, emulsified and surface oils, and other emulsified chemicals, in addition to suspended solids. The oil/hydrocarbon concentration in the wastewater stream may fluctuate from 1 ppm to 10,000 ppm, depending on the specifications of the industry and the level of water quality control. The authors have developed a sensing technology that provides the accuracy of scatter/absorption sensing in a contactless environment by combining these methodologies with reflective measurement. The sensitivity of the sensor may be modified by changing the fluid level control in the flow cell, allowing for accurate measurement over a broad range from 1 ppm to 10,000 ppm. Because this sensing system has been designed to work in a highly invasive environment, it can be placed close to the process source to allow for accurate real-time measurement and control.
Infrared thermography of welding zones produced by polymer extrusion additive manufacturing
Seppala, Jonathan E.; Migler, Kalman D.
2016-01-01
In common thermoplastic additive manufacturing (AM) processes, a solid polymer filament is melted, extruded through a rastering nozzle, welded onto neighboring layers and solidified. The temperature of the polymer at each of these stages is the key parameter governing these non-equilibrium processes, but due to its strong spatial and temporal variations, it is difficult to measure accurately. Here we utilize infrared (IR) imaging - in conjunction with necessary reflection corrections and calibration procedures - to measure these temperature profiles of a model polymer during 3D printing. From the temperature profiles of the printed layer (road) and sublayers, the temporal profile of the crucially important weld temperatures can be obtained. Under typical printing conditions, the weld temperature decreases at a rate of approximately 100 °C/s and remains above the glass transition temperature for approximately 1 s. These measurement methods are a first step in the development of strategies to control and model the printing processes and in the ability to develop models that correlate critical part strength with material and processing parameters. PMID:29167755
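The reported cooling behavior can be illustrated with a simple Newtonian-cooling estimate (an illustrative sketch only; the extrusion temperature, ambient temperature, glass transition temperature, and time constant below are assumed values, not figures from the paper):

```python
import math

# Assumed (hypothetical) values, chosen so the initial cooling rate is ~100 C/s
T0 = 250.0     # extrusion temperature, deg C (assumed)
T_env = 50.0   # effective ambient/sublayer temperature, deg C (assumed)
Tg = 105.0     # glass transition temperature, deg C (assumed)
tau = 2.0      # cooling time constant, s (assumed); initial rate = (T0-T_env)/tau

def weld_temperature(t):
    """Newtonian cooling: T(t) = T_env + (T0 - T_env) * exp(-t / tau)."""
    return T_env + (T0 - T_env) * math.exp(-t / tau)

initial_rate = (T0 - T_env) / tau                         # 100 C/s by construction
t_above_Tg = tau * math.log((T0 - T_env) / (Tg - T_env))  # time until T(t) == Tg
print(initial_rate, round(t_above_Tg, 2))
```

With these assumed numbers the weld stays above Tg for roughly 2.6 s, the same order of magnitude as the ~1 s window reported in the abstract.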
Diagnostic for Plasma Enhanced Chemical Vapor Deposition and Etch Systems
NASA Technical Reports Server (NTRS)
Cappelli, Mark A.
1999-01-01
In order to meet NASA's requirements for the rapid development and validation of future-generation electronic devices, as well as associated materials and processes, enabling technologies in the processing of semiconductor materials arising from an understanding of etch chemistries are being developed through a research collaboration between Stanford University and NASA Ames Research Center. Although a great deal of laboratory-scale research has been performed on many materials-processing plasmas, little is known about the gas-phase and surface chemical reactions that are critical in many etch and deposition processes, or about how these reactions are influenced by variations in operating conditions. In addition, many plasma-based processes suffer from stability and reliability problems, leading to compromised performance and potentially increased cost for the semiconductor manufacturing industry. Such a lack of understanding has hindered the development of process models that can aid in the scaling and improvement of plasma etch and deposition systems. The research described involves the study of plasmas used in semiconductor processes. An inductively coupled plasma (ICP) source in place of the standard upper-electrode assembly of the Gaseous Electronics Conference (GEC) radio-frequency (RF) Reference Cell is used to investigate the discharge characteristics and chemistries. This ICP source generates plasmas with higher electron densities (approximately 10(exp 12)/cu cm) and lower operating pressures (approximately 7 mTorr) than obtainable with the original parallel-plate version of the GEC Cell. This expanded operating regime is more relevant to the new generations of industrial plasma systems being used by the microelectronics industry. The motivation for this study is to develop an understanding of the physical phenomena involved in plasma processing and to measure much-needed fundamental parameters, such as gas-phase and surface reaction rates, species concentrations, temperature, ion energy distributions, and electron number density. A wide variety of diagnostic techniques are under development through this consortium grant to measure these parameters, including molecular beam mass spectrometry (MBMS), Fourier transform infrared (FTIR) spectroscopy, broadband ultraviolet (UV) absorption spectroscopy, and a compensated Langmuir probe. Additional diagnostics, such as microwave interferometry and microwave absorption for measurements of plasma density and radical concentrations, are also planned.
The electrostatics of parachutes
NASA Astrophysics Data System (ADS)
Yu, Li; Ming, Xiao
2007-12-01
In parachute research, modeling the canopy inflation process is one of the most complicated tasks. Because the canopy often experiences its largest deformations and loadings within a very short time, it poses great difficulty for theoretical analysis and experimental measurement. In this paper, aerodynamic equations and structural dynamics equations were developed to describe the parachute opening process, and an iterative coupled solving strategy incorporating these equations was proposed for a small-scale, flexible, flat-circular parachute. Analyses were then carried out of the canopy geometry, the time-dependent pressure difference between the inside and outside of the canopy, the transient vortex around the canopy, and the flow field in the radial plane over the course of the opening process. The mechanism of canopy shape development was explained from the perspective of the transient flow fields during inflation. Experiments on the parachute opening process were conducted in a wind tunnel, in which the instantaneous shape of the canopy was measured by a high-speed camera and the opening load was measured by a dynamometer balance. The theoretical predictions were found to be in good agreement with the experimental results, validating the proposed approach. This numerical method can reduce the strong dependence of parachute research on wind tunnel tests and is significant for understanding the mechanics of the parachute inflation process.
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, quantifying uncertainty, and improving overall control of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows variations in measurements to be tracked over time and provides an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. By utilizing design of experiments (DOE) methodology in conjunction with current SPC practices, uncertainties can be characterized more efficiently and robustly, and enhanced process improvement procedures can be developed. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
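Tracking an estimated coefficient in a control chart can be sketched with simple Shewhart individuals limits (a generic illustration with made-up numbers, not the program's actual charting procedure):

```python
def control_limits(samples, k=3.0):
    """Return (LCL, UCL) = mean -/+ k * sample standard deviation."""
    n = len(samples)
    mean = sum(samples) / n
    sigma = (sum((x - mean) ** 2 for x in samples) / (n - 1)) ** 0.5
    return mean - k * sigma, mean + k * sigma

def out_of_control(samples, new_value, k=3.0):
    """Flag a newly estimated coefficient that falls outside the limits."""
    lcl, ucl = control_limits(samples, k)
    return not (lcl <= new_value <= ucl)

# Hypothetical history of one regression coefficient across check-standard tests
history = [8.0, 10.0, 12.0, 10.0, 10.0]
print(control_limits(history))
print(out_of_control(history, 15.0))  # True -- outside the 3-sigma limits
```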
NASA Astrophysics Data System (ADS)
Lyu, H.; Ni, G.; Sun, T.
2016-12-01
Urban stormwater management helps restore the water cycle to a nearly natural state. Analyzing hydrologic performance at the watershed scale is challenging, because the measures vary in kind and scale and act in different processes. A three-process framework is developed to simplify the urban surface hydrologic process and to evaluate urban stormwater management. The three processes, source utilization, transfer regulation, and terminal detention, control the stormwater in order before it is discharged. Performance is analyzed from the proportion of water controlled by each process, calculated using the USEPA Storm Water Management Model (SWMM). A case study from Beijing illustrates how performance varies under a set of design storms of different return periods. This framework provides a method to assess urban stormwater management as a whole system, considering the interactions between measures, and to identify any weak process in an urban watershed that could be improved. The results help inform better solutions to urban water crises.
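The ordered three-process accounting can be sketched as a simple volume balance (an illustrative toy model with hypothetical capacities; the study itself derives the controlled volumes from SWMM simulations rather than fixed caps):

```python
def three_process_split(runoff, source_cap, transfer_cap, terminal_cap):
    """Split a runoff volume over source utilization, transfer regulation,
    and terminal detention, in that order; the remainder is discharged.
    Returns the proportion of runoff handled by each process."""
    source = min(runoff, source_cap)
    transfer = min(runoff - source, transfer_cap)
    terminal = min(runoff - source - transfer, terminal_cap)
    discharged = runoff - source - transfer - terminal
    return {
        "source": source / runoff,
        "transfer": transfer / runoff,
        "terminal": terminal / runoff,
        "discharged": discharged / runoff,
    }

# Hypothetical 100 mm event with 30/40/20 mm process capacities
print(three_process_split(100.0, 30.0, 40.0, 20.0))
```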
Students' Ability to Solve Process-Diagram Problems in Secondary Biology Education
ERIC Educational Resources Information Center
Kragten, Marco; Admiraal, Wilfried; Rijlaarsdam, Gert
2015-01-01
Process diagrams are important tools in biology for explaining processes such as protein synthesis, compound cycles and the like. The aim of the present study was to measure the ability to solve process-diagram problems in biology and its relationship with prior knowledge, spatial ability and working memory. For this purpose, we developed a test…
Safe procedure development to manage hazardous drugs in the workplace.
Gaspar Carreño, Marisa; Achau Muñoz, Rubén; Torrico Martín, Fátima; Agún Gonzalez, Juan José; Sanchez Santos, Jose Cristobal; Cercos Lletí, Ana Cristina; Ramos Orozco, Pedro
2017-03-01
To develop a safe working procedure for employees of the Intermutual Hospital de Levante (HIL) in those areas of activity that involve the handling of hazardous drugs (HD). The procedure was developed in six phases: 1) hazard definition; 2) definition and identification of processes and development of general correct work practices for the selection and special handling of hazardous drugs; 3) detection, selection, and establishment of specific recommendations for handling hazardous drugs during the preparation and administration processes included in the hospital GFT; 4) categorization of risk during preparation/administration and development of an identification system; 5) information and training of professionals; 6) implementation of the identification measures and prevention guidelines. Six processes involving HD handling were detected. Within these processes, thirty HD included in the hospital GFT were identified, and a safer alternative was found for 6 of them. The HD were classified into 4 risk categories based on the measures to be taken during the preparation and administration of each. The development and implementation of specific safe-work processes for medication handling allows hospital managers to comply effectively with their legal obligations in the area of prevention, and provides healthcare professionals with adequate techniques and safety equipment to avoid the possible dangers and risks of some drugs. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Quantifying spatial differences in metabolism in headwater streams
Ricardo González-Pinzón; Roy Haggerty; Alba Argerich
2014-01-01
Stream functioning includes simultaneous interaction among solute transport, nutrient processing, and metabolism. Metabolism is measured with methods that have limited spatial representativeness and are highly uncertain. These problems restrict development of methods for up-scaling biological processes that mediate nutrient processing. We used the resazurin-resorufin (...
Innovative model of business process reengineering at machine building enterprises
NASA Astrophysics Data System (ADS)
Nekrasov, R. Yu; Tempel, Yu A.; Tempel, O. A.
2017-10-01
The paper provides consideration of business process reengineering viewed as a managerial innovation accepted by present-day machine building enterprises, as well as ways to improve its procedure. A developed innovative model of reengineering measures is described and is based on the process approach and other principles of company management.
Design and validation of instruments to measure knowledge.
Elliott, T E; Regal, R R; Elliott, B A; Renier, C M
2001-01-01
Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.
NASA Technical Reports Server (NTRS)
Choi, H. J.; Su, Y. T.
1986-01-01
The User Constraint Measurement System (UCMS) is a hardware/software package developed by NASA Goddard to measure the signal parameter constraints of the user transponder in the TDRSS environment by means of an all-digital signal sampling technique. An account is presently given of the features of UCMS design and of its performance capabilities and applications; attention is given to such important aspects of the system as RF interface parameter definitions, hardware minimization, the emphasis on offline software signal processing, and end-to-end link performance. Applications to the measurement of other signal parameters are also discussed.
Measuring the diffusion of innovative health promotion programs.
Steckler, A; Goodman, R M; McLeroy, K R; Davis, S; Koch, G
1992-01-01
Once a health promotion program has proven to be effective in one or two initial settings, attempts may be made to transfer the program to new settings. One way to conceptualize the transference of health promotion programs from one locale to another is by considering the programs to be innovations that are being diffused. In this way, diffusion of innovation theory can be applied to guide the process of program transference. This article reports on the development of six questionnaires to measure the extent to which health promotion programs are successfully disseminated: Organizational Climate, Awareness-Concern, Rogers's Adoption Variables, Level of Use, Level of Success, and Level of Institutionalization. The instruments are being successfully used in a study of the diffusion of health promotion/tobacco prevention curricula to junior high schools in North Carolina. The instruments, which measure the four steps of the diffusion process, have construct validity since they were developed within existing theories and are derived from the work of previous researchers. No previous research has attempted to use instruments like these to measure sequentially the stages of the diffusion process.
Sevenster, Merlijn; Bozeman, Jeffrey; Cowhy, Andrea; Trost, William
2015-02-01
To standardize and objectivize treatment response assessment in oncology, guidelines have been proposed that are driven by radiological measurements, which are typically communicated in free-text reports defying automated processing. We study through inter-annotator agreement and natural language processing (NLP) algorithm development the task of pairing measurements that quantify the same finding across consecutive radiology reports, such that each measurement is paired with at most one other ("partial uniqueness"). Ground truth is created based on 283 abdomen and 311 chest CT reports of 50 patients each. A pre-processing engine segments reports and extracts measurements. Thirteen features are developed based on volumetric similarity between measurements, semantic similarity between their respective narrative contexts and structural properties of their report positions. A Random Forest classifier (RF) integrates all features. A "mutual best match" (MBM) post-processor ensures partial uniqueness. In an end-to-end evaluation, RF has precision 0.841, recall 0.807, F-measure 0.824 and AUC 0.971; with MBM, which performs above chance level (P<0.001), it has precision 0.899, recall 0.776, F-measure 0.833 and AUC 0.935. RF (RF+MBM) has error-free performance on 52.7% (57.4%) of report pairs. Inter-annotator agreement of three domain specialists with the ground truth (κ>0.960) indicates that the task is well defined. Domain properties and inter-section differences are discussed to explain superior performance in abdomen. Enforcing partial uniqueness has mixed but minor effects on performance. A combined machine learning-filtering approach is proposed for pairing measurements, which can support prospective (supporting treatment response assessment) and retrospective purposes (data mining). Copyright © 2014 Elsevier Inc. All rights reserved.
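The "mutual best match" post-processing step can be sketched as follows (a minimal reconstruction from the abstract's description: a pair is kept only when each measurement is the other's highest-scoring candidate, which enforces partial uniqueness; the measurement names and scores are hypothetical):

```python
def mutual_best_match(scores):
    """Keep only pairs (i, j) where j is i's best-scoring candidate AND
    i is j's best-scoring candidate ("mutual best match"), so each
    measurement appears in at most one pair (partial uniqueness)."""
    best_left, best_right = {}, {}
    for (i, j), s in scores.items():
        if i not in best_left or s > scores[(i, best_left[i])]:
            best_left[i] = j
        if j not in best_right or s > scores[(best_right[j], j)]:
            best_right[j] = i
    return sorted((i, j) for i, j in best_left.items() if best_right[j] == i)

# Hypothetical classifier scores for measurement pairs across two reports
scores = {
    ("a1", "b1"): 0.9, ("a1", "b2"): 0.4,
    ("a2", "b1"): 0.8, ("a2", "b2"): 0.3,
}
print(mutual_best_match(scores))  # [('a1', 'b1')] -- a2 stays unpaired
```

Note that a2's best candidate is b1, but b1 prefers a1, so a2 is left unmatched rather than paired greedily.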
Performance Measurement using KPKU- BUMN in X School Education Foundation
NASA Astrophysics Data System (ADS)
Arijanto, Sugih; Harsono, Ambar; Taroepratjeka, Harsono
2016-01-01
The purpose of this research is to determine X School's strengths and opportunities for improvement through performance measurement using KPKU-BUMN (Kriteria Penilaian Kinerja Unggul - Kementerian Badan Usaha Milik Negara). KPKU-BUMN was developed based on the Malcolm Baldrige Criteria for Performance Excellence (MBCfPE). X School is an education foundation in Bandung that provides education from kindergarten and elementary school to junior and senior high school. The measurement covers two aspects, Process and Result. The Process is measured with the A-D-L-I approach (Approach-Deployment-Learning-Integration), while the Result is measured with the Le-T-C-I approach (Level-Trend-Comparison-Integration). Six process categories are measured: (1) Leadership, (2) Strategic Planning, (3) Customer Focus, (4) Measurement, Analysis and Knowledge Management, (5) Workforce Focus, and (6) Operations Focus. The result categories are (a) product & process outcomes, (b) customer-focused outcomes, (c) workforce-focused outcomes, (d) leadership & governance outcomes, and (e) financial & market outcomes. The overall score for X School is 284/1000, which places X School at the “early result” level with a “poor” global image.
Stretchable electronics for wearable and high-current applications
NASA Astrophysics Data System (ADS)
Hilbich, Daniel; Shannon, Lesley; Gray, Bonnie L.
2016-04-01
Advances in the development of novel materials and fabrication processes are resulting in an increased number of flexible and stretchable electronics applications. This evolving technology enables new devices that are not readily fabricated using traditional silicon processes, and has the potential to transform many industries, including personalized healthcare, consumer electronics, and communication. Fabrication of stretchable devices is typically achieved through the use of stretchable polymer-based conductors, or more rigid conductors, such as metals, with patterned geometries that can accommodate stretching. Although the application space for stretchable electronics is extensive, the practicality of these devices can be severely limited by power consumption and cost. Moreover, strict process flows can impede innovation that would otherwise enable new applications. In an effort to overcome these impediments, we present two modified approaches and applications based on a newly developed process for stretchable and flexible electronics fabrication. This includes the development of a metallization pattern stamping process allowing for 1) stretchable interconnects to be directly integrated with stretchable/wearable fabrics, and 2) a process variation enabling aligned multi-layer devices with integrated ferromagnetic nanocomposite polymer components enabling a fully-flexible electromagnetic microactuator for large-magnitude magnetic field generation. The wearable interconnects are measured, showing high conductivity, and can accommodate over 20% strain before experiencing conductive failure. The electromagnetic actuators have been fabricated and initial measurements show well-aligned, highly conductive, isolated metal layers. These two applications demonstrate the versatility of the newly developed process and suggest potential for its furthered use in stretchable electronics and MEMS applications.
National Security Technology Incubator Evaluation Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
This report describes the process by which the National Security Technology Incubator (NSTI) will be evaluated. The technology incubator is being developed as part of the National Security Preparedness Project (NSPP), funded by a Department of Energy (DOE)/National Nuclear Security Administration (NNSA) grant. This report includes a brief description of the components, steps, and measures of the proposed evaluation process. The purpose of the NSPP is to promote national security technologies through business incubation, technology demonstration and validation, and workforce development. The NSTI will focus on serving businesses with national security technology applications by nurturing them through critical stages of early development. An effective evaluation process of the NSTI is an important step, as it can provide qualitative and quantitative information on incubator performance over a given period. The vision of the NSTI is to be a successful incubator of technologies and private enterprise that assist the NNSA in meeting new challenges in national safety and security. The mission of the NSTI is to identify, incubate, and accelerate technologies with national security applications at various stages of development by providing hands-on mentoring and business assistance to small businesses and emerging or growing companies. To achieve success for both incubator businesses and the NSTI program, an evaluation process is essential to effectively measure results and implement corrective processes in the incubation design if needed. The evaluation process design will collect and analyze qualitative and quantitative data through a performance evaluation system.
Simultaneous measurements of Cotton fiber maturity, fineness, ribbon width, and micronaire
USDA-ARS?s Scientific Manuscript database
Maturity (degree of secondary wall development) and fineness (linear density) are important cotton quality and processing properties, but their direct measurement is often difficult and/or expensive to perform. An indirect but critical measurement of maturity and fineness is micronaire, which is on...
NASA Technical Reports Server (NTRS)
Sekula, Martin K.
2012-01-01
Projection moiré interferometry (PMI) was employed to measure blade deflections during a hover test of a generic model-scale rotor in the NASA Langley 14x22 subsonic wind tunnel's hover facility. PMI was one of several optical measurement techniques tasked to acquire deflection and flow visualization data for a rotor at several distinct heights above a ground plane. Two of the main objectives of this test were to demonstrate that multiple optical measurement techniques can be used simultaneously to acquire data and to identify and address deficiencies in the techniques. Several PMI-specific technical challenges needed to be addressed during the test and in post-processing of the data. These challenges included developing an efficient and accurate calibration method for an extremely large (65 inch) height range; automating the analysis of the large amount of data acquired during the test; and developing a method to determine the absolute displacement of rotor blades without a required anchor-point measurement. The results indicate that the use of a single-camera/single-projector approach for the large height range reduced the accuracy of the PMI system compared to PMI systems designed for smaller height ranges. The lack of the anchor-point measurement (due to a technical issue with one of the other measurement techniques) limited the ability of the PMI system to correctly measure blade displacements to only one of the three rotor heights tested. The new calibration technique reduced the data required by 80 percent, while new post-processing algorithms successfully automated the process of locating rotor blades in images, determining the blade quarter-chord location, and calculating the blade root and blade tip heights above the ground plane.
UAVSAR: Airborne L-band Radar for Repeat Pass Interferometry
NASA Technical Reports Server (NTRS)
Moes, Timothy R.
2009-01-01
The primary objectives of the UAVSAR Project were to: a) develop a miniaturized polarimetric L-band synthetic aperture radar (SAR) for use on an unmanned aerial vehicle (UAV) or piloted vehicle. b) develop the associated processing algorithms for repeat-pass differential interferometric measurements using a single antenna. c) conduct measurements of geophysical interest, particularly changes of rapidly deforming surfaces such as volcanoes or earthquakes. Two complete systems were developed. Operational Science Missions began on February 18, 2009 ... concurrent development and testing of the radar system continues.
Multi-interface Level Sensors and New Development in Monitoring and Control of Oil Separators
Bukhari, Syed Faisal Ahmed; Yang, Wuqiang
2006-01-01
In the oil industry, huge savings may be made if suitable multi-interface level measurement systems are employed for effective monitoring of crude oil separators and efficient control of their operation. A number of techniques, e.g. externally mounted displacers, differential pressure transmitters and capacitance rod devices, have been developed to measure the separation process with gas, oil, water and other components. Because of the unavailability of suitable multi-interface level measurement systems, oil separators are currently operated by a trial-and-error approach. In this paper some conventional techniques, which have been used for level measurement in industry, and new developments are discussed.
ERIC Educational Resources Information Center
Simpson, Genevieve; Clifton, Julian
2016-01-01
Peer review feedback, developed to assist students with increasing the quality of group reports and developing peer review skills, was added to a master's level Climate Change Policy and Planning unit. A pre- and post-survey was conducted to determine whether students found the process a valuable learning opportunity: 87% of students responding to…
ERIC Educational Resources Information Center
Fernald, Anne; Marchman, Virginia A.
2012-01-01
Using online measures of familiar word recognition in the looking-while-listening procedure, this prospective longitudinal study revealed robust links between processing efficiency and vocabulary growth from 18 to 30 months in children classified as typically developing (n = 46) and as "late talkers" (n = 36) at 18 months. Those late talkers who…
ERIC Educational Resources Information Center
Jakku-Sihvonen, Ritva; Tissari, Varpu; Ots, Aivar; Uusiautti, Satu
2012-01-01
During the Bologna process, from 2003 to 2006, degree programmes, including teacher education curricula, were developed in line with the two-tier system--the European Credit Transfer and Accumulation System (ECTS) and modularization. The purpose of the present study is to contribute to the development of teacher education profiling measures by…
Monitoring non-thermal plasma processes for nanoparticle synthesis
NASA Astrophysics Data System (ADS)
Mangolini, Lorenzo
2017-09-01
Process characterization tools have played a crucial role in the investigation of dusty plasmas. The presence of dust in certain non-thermal plasma processes was first detected by laser light scattering measurements. Techniques like laser induced particle explosive evaporation and ion mass spectrometry have provided the experimental evidence necessary for the development of the theory of particle nucleation in silane-containing non-thermal plasmas. This review provides first a summary of these early efforts, and then discusses recent investigations using in situ characterization techniques to understand the interaction between nanoparticles and plasmas. The advancement of such monitoring techniques is necessary to fully develop the potential of non-thermal plasmas as unique materials synthesis and processing platforms. At the same time, the strong coupling between materials and plasma properties suggests that it is also necessary to advance techniques for the measurement of plasma properties while in the presence of dust. Recent progress in this area will be discussed.
Mechanical impedance measurements for improved cost-effective process monitoring
NASA Astrophysics Data System (ADS)
Clopet, Caroline R.; Pullen, Deborah A.; Badcock, Rodney A.; Ralph, Brian; Fernando, Gerard F.; Mahon, Steve W.
1999-06-01
The aerospace industry has seen considerable growth in composite usage over the past ten years, especially with the development of cost-effective manufacturing techniques such as Resin Transfer Molding and Resin Infusion under Flexible Tooling. The relatively high cost of raw materials and conservative processing schedules have limited further growth into non-aerospace technologies. In-situ process monitoring has been explored for some time as a means of improving the cost efficiency of manufacturing, with dielectric spectroscopy and optical fiber sensors being the two primary techniques developed to date. An emerging technique is discussed here that makes use of piezoelectric wafers able to sense not only aspects of resin flow but also changes in the properties of the resin as it cures. Experimental investigations to date have shown a correlation between mechanical impedance measurements and the mechanical properties of cured epoxy systems, with potential for full process monitoring.
Davis, Ben; Grosvenor, Chriss; Johnk, Robert; Novotny, David; Baker-Jarvis, James; Janezic, Michael
2007-01-01
Building materials are often incorporated into complex, multilayer macrostructures that are simply not amenable to measurements using coax or waveguide sample holders. In response to this, we developed an ultra-wideband (UWB) free-field measurement system. This measurement system uses a ground-plane-based system and two TEM half-horn antennas to transmit and receive the RF signal. The material samples are placed between the antennas, and reflection and transmission measurements are made. Digital signal processing techniques are then applied to minimize environmental and systematic effects. The processed data are compared to a plane-wave model to extract the material properties with optimization software based on genetic algorithms.
Consistent and efficient processing of ADCP streamflow measurements
Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan
2016-01-01
The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements, all independent of the ADCP manufacturer, are being developed in a software program that can process moving-boat discharge measurements regardless of the ADCP used to collect the data.
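The core of such a manufacturer-independent computation can be sketched briefly. In the moving-boat method, each depth cell's contribution to discharge is the cross product of the horizontal water and boat velocities scaled by the cell height, summed over cells and time. The sketch below is an illustrative assumption only (simplified sign convention, invented function names, no extrapolation of the unmeasured top and bottom zones), not the actual algorithms of the software described:

```python
def cell_discharge(water_vel, boat_vel, cell_size):
    """Discharge through one depth cell: the cross product of the
    horizontal water and boat velocities (east, north components),
    scaled by the cell height. Sign convention is illustrative."""
    (wx, wy), (bx, by) = water_vel, boat_vel
    return (wy * bx - wx * by) * cell_size

def ensemble_discharge(cells, boat_vel, cell_size, dt):
    # Sum the measured cells and scale by the sampling interval;
    # top/bottom extrapolation and edge estimates are omitted here.
    return sum(cell_discharge(v, boat_vel, cell_size) for v in cells) * dt

# Two 0.5 m cells, water moving north at 1 m/s, boat moving east at
# 1 m/s, one 1-second ensemble:
q = ensemble_discharge([(0.0, 1.0), (0.0, 1.0)], (1.0, 0.0), 0.5, 1.0)
```

A full transect discharge would sum such ensembles along the boat's path and add the edge and unmeasured-zone estimates whose algorithms the abstract says currently vary by manufacturer.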
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isselhardt, B. H.; Prussin, S. G.; Savina, M. R.
2016-01-01
Resonance Ionization Mass Spectrometry (RIMS) has been developed as a method to measure uranium isotope abundances. In this approach, RIMS is used as an element-selective ionization process to distinguish between uranium atoms and potential isobars without the aid of chemical purification and separation. The use of broad bandwidth lasers with automated feedback control of wavelength was applied to the measurement of the U-235/U-238 ratio to decrease laser-induced isotopic fractionation. In application, isotope standards are used to identify and correct bias in measured isotope ratios, but understanding laser-induced bias from first principles can improve the precision and accuracy of experimental measurements. A rate equation model for predicting the relative ionization probability has been developed to study the effect of variations in laser parameters on the measured isotope ratio. The model uses atomic data and empirical descriptions of laser performance to estimate the laser-induced bias expected in experimental measurements of the U-235/U-238 ratio. Empirical corrections are also included to account for ionization processes that are difficult to calculate from first principles with the available atomic data. Development of this model has highlighted several important considerations for properly interpreting experimental results.
Isselhardt, B. H.; Prussin, S. G.; Savina, M. R.; ...
2015-12-07
Resonance Ionization Mass Spectrometry (RIMS) has been developed as a method to measure uranium isotope abundances. In this approach, RIMS is used as an element-selective ionization process to distinguish between uranium atoms and potential isobars without the aid of chemical purification and separation. The use of broad bandwidth lasers with automated feedback control of wavelength was applied to the measurement of the 235U/238U ratio to decrease laser-induced isotopic fractionation. In application, isotope standards are used to identify and correct bias in measured isotope ratios, but understanding laser-induced bias from first principles can improve the precision and accuracy of experimental measurements. A rate equation model for predicting the relative ionization probability has been developed to study the effect of variations in laser parameters on the measured isotope ratio. The model uses atomic data and empirical descriptions of laser performance to estimate the laser-induced bias expected in experimental measurements of the 235U/238U ratio. Empirical corrections are also included to account for ionization processes that are difficult to calculate from first principles with the available atomic data. As a result, development of this model has highlighted several important considerations for properly interpreting experimental results.
Gaguski, Michele; George, Kim; Bruce, Susan; Brucker, Edie; Leija, Carol; LeFebvre, Kristine; Thompson Mackey, Heather
2017-09-25
A project team was formed by the Oncology Nursing Society (ONS) to create evidence-based oncology nurse generalist (ONG) competencies to establish best practices in competency development, including high-risk tasks, critical thinking criteria, and measurement of key areas for oncology nurses. This article describes the process used to develop the ONG competencies and includes outcomes and suggestions for use in clinical practice. Institutions can use the ONG competencies to assess and develop competency programs, offer unique educational strategies to measure and appraise proficiency, and establish processes to foster a workplace environment committed to mentoring and teaching future oncology nurses. © 2017 Oncology Nursing Society
Development of a measure of work motivation for a meta-theory of motivation.
Ryan, James C
2011-06-01
This study presents a measure of work motivation designed to assess the motivational concepts of the meta-theory of motivation. These concepts include intrinsic process motivation, goal internalization motivation, instrumental motivation, external self-concept motivation, and internal self-concept motivation. Following a process of statement development and identification, six statements for each concept were presented to a sample of working professionals (N = 330) via a paper-and-pencil questionnaire. Parallel analysis supported a 5-factor solution, with a varimax rotation identifying 5 factors accounting for 48.9% of total variance. All 5 scales had Cronbach alpha coefficients above .70. Limitations of the newly proposed questionnaire and suggestions for its further development and use are discussed.
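The internal-consistency statistic reported above (a Cronbach alpha coefficient above .70 for each of the five scales) can be computed directly from item-level scores. A minimal sketch of the standard formula; the data in any use would be the questionnaire's own responses, and nothing here reproduces the study's analysis:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: one list of respondent scores per questionnaire item,
    all of equal length. Uses sample (n-1) variances.
    """
    k = len(items)

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent across the k items.
    totals = [sum(col) for col in zip(*items)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))
```

When every item tracks the total score perfectly, alpha reaches 1.0; values above about .70 are conventionally taken as acceptable internal consistency, matching the threshold the abstract cites.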
Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing
NASA Astrophysics Data System (ADS)
Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander
2005-09-01
The use of infrared spectroscopy for production semiconductor process monitoring has evolved recently from primarily unpatterned, i.e. blanket test wafer measurements in a limited historical application space of blanket epitaxial, BPSG, and FSG layers to new applications involving patterned product wafer measurements, and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates and high aspect ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address the challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition and high throughput ultra clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we will explore the use of model-based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high density dynamic random access memory (DRAM) chips. We will explore the capability of the tool for characterizing multiple geometric parameters associated with the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isselhardt, Brett H.
2011-09-01
Resonance Ionization Mass Spectrometry (RIMS) has been developed as a method to measure relative uranium isotope abundances. In this approach, RIMS is used as an element-selective ionization process to provide a distinction between uranium atoms and potential isobars without the aid of chemical purification and separation. We explore the laser parameters critical to the ionization process and their effects on the measured isotope ratio. Specifically, the use of broad bandwidth lasers with automated feedback control of wavelength was applied to the measurement of 235U/238U ratios to decrease laser-induced isotopic fractionation. By broadening the bandwidth of the first laser in a 3-color, 3-photon ionization process from a bandwidth of 1.8 GHz to about 10 GHz, the variation in sequential relative isotope abundance measurements decreased from >10% to less than 0.5%. This procedure was demonstrated for the direct interrogation of uranium oxide targets with essentially no sample preparation. A rate equation model for predicting the relative ionization probability has been developed to study the effect of variation in laser parameters on the measured isotope ratio. This work demonstrates that RIMS can be used for the robust measurement of uranium isotope ratios.
Model for Simulating a Spiral Software-Development Process
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Curley, Charles; Nayak, Umanath
2010-01-01
A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). 
Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
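The waterfall-versus-spiral comparison the model supports can be caricatured in a few lines: a waterfall run executes each phase once on the full scope, while a spiral run repeats the phase sequence on a fraction of the scope per iteration and adds a risk-assessment overhead each time. The phase efforts and overhead below are invented illustrative numbers, not PATT inputs or outputs:

```python
# Illustrative per-phase effort (person-days) for a small project.
PHASES = {"requirements": 10, "design": 20, "coding": 40, "testing": 30}

def waterfall_effort():
    # One pass through every phase on the full scope.
    return sum(PHASES.values())

def spiral_effort(iterations, scope_per_iter, risk_overhead=5):
    # Each iteration runs every phase on a fraction of the scope,
    # plus a fixed risk-assessment overhead per iteration.
    per_iter = risk_overhead + sum(e * scope_per_iter for e in PHASES.values())
    return iterations * per_iter
```

With these numbers, four spiral iterations at a quarter of the scope each cost more than one waterfall pass, mirroring the abstract's observation that a spiral process may cost more yet better absorb changing requirements.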
Grazing Incidence Wavefront Sensing and Verification of X-Ray Optics Performance
NASA Technical Reports Server (NTRS)
Saha, Timo T.; Rohrbach, Scott; Zhang, William W.
2011-01-01
Evaluation of interferometrically measured mirror metrology data and characterization of a telescope wavefront can be powerful tools in understanding the image characteristics of an x-ray optical system. In the development of the soft x-ray telescope for the International X-Ray Observatory (IXO), we have developed new approaches to support the telescope development process. Interferometric measurement of the optical components over all relevant spatial frequencies can be used to evaluate and predict the performance of an x-ray telescope. Typically, the mirrors are measured using a mount that minimizes mount- and gravity-induced errors. In the assembly and mounting process the shape of the mirror segments can change dramatically. We have developed wavefront sensing techniques suitable for x-ray optical components to aid us in the characterization and evaluation of these changes. Hartmann sensing of a telescope and its components is a simple method that can be used to evaluate low-order mirror surface errors and alignment errors. Phase retrieval techniques can also be used to assess and estimate the low-order axial errors of the primary and secondary mirror segments. In this paper we describe the mathematical foundation of our Hartmann and phase retrieval sensing techniques. We show how these techniques can be used in the evaluation and performance prediction process of x-ray telescopes.
Development and Evaluation of Vocational Competency Measures. Final Report.
ERIC Educational Resources Information Center
Chalupsky, Albert B.; And Others
A series of occupational competency tests representing all seven vocational education curriculum areas were developed, field tested, and validated. Seventeen occupations were selected for competency test development: agricultural chemicals applications technician, farm equipment mechanic, computer operator, word processing specialist, apparel…
Development of the Sexual Minority Adolescent Stress Inventory
Schrager, Sheree M.; Goldbach, Jeremy T.; Mamey, Mary Rose
2018-01-01
Although construct measurement is critical to explanatory research and intervention efforts, rigorous measure development remains a notable challenge. For example, though the primary theoretical model for understanding health disparities among sexual minority (e.g., lesbian, gay, bisexual) adolescents is minority stress theory, nearly all published studies of this population rely on minority stress measures with poor psychometric properties and development procedures. In response, we developed the Sexual Minority Adolescent Stress Inventory (SMASI) with N = 346 diverse adolescents ages 14–17, using a comprehensive approach to de novo measure development designed to produce a measure with desirable psychometric properties. After exploratory factor analysis on 102 candidate items informed by a modified Delphi process, we applied item response theory techniques to the remaining 72 items. Discrimination and difficulty parameters and item characteristic curves were estimated overall, within each of 12 initially derived factors, and across demographic subgroups. Two items were removed for excessive discrimination and three were removed following reliability analysis. The measure demonstrated configural and scalar invariance for gender and age; a three-item factor was excluded for demonstrating substantial differences by sexual identity and race/ethnicity. The final 64-item measure comprised 11 subscales and demonstrated excellent overall (α = 0.98), subscale (α range 0.75–0.96), and test–retest (scale r > 0.99; subscale r range 0.89–0.99) reliabilities. Subscales represented a mix of proximal and distal stressors, including domains of internalized homonegativity, identity management, intersectionality, and negative expectancies (proximal) and social marginalization, family rejection, homonegative climate, homonegative communication, negative disclosure experiences, religion, and work domains (distal). 
Thus, the SMASI development process illustrates a method to incorporate information from multiple sources, including item response theory models, to guide item selection in building a psychometrically sound measure. We posit that similar methods can be used to improve construct measurement across all areas of psychological research, particularly in areas where a strong theoretical framework exists but existing measures are limited. PMID:29599737
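The discrimination and difficulty parameters estimated above come from item response theory. In the two-parameter logistic (2PL) model, an item characteristic curve gives the probability of endorsing an item as a function of the latent trait level θ. A minimal sketch of the standard 2PL form; the parameter values in the test are illustrative, not SMASI estimates:

```python
import math

def icc(theta, a, b):
    """2PL item characteristic curve.

    theta: latent trait level of the respondent
    a: discrimination (slope at the curve's midpoint)
    b: difficulty (trait level at which endorsement probability is 0.5)
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

An item flagged for "excessive discrimination," as two SMASI items were, corresponds to a very large a: the curve becomes a near step function, separating respondents only at a single trait level.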
The Use of Thermal Remote Sensing to Study Thermodynamics of Ecosystem Development
NASA Technical Reports Server (NTRS)
Luvall, Jeffrey C.; Rickman, Doug L.; Arnold, James E. (Technical Monitor)
2000-01-01
Thermal remote sensing can provide environmental measuring tools with capabilities for measuring ecosystem development and integrity. Recent advances in applying principles of nonequilibrium thermodynamics to ecology provide fundamental insights into energy partitioning in ecosystems. Ecosystems are nonequilibrium systems, open to material and energy flows, which grow and develop structures and processes to increase energy degradation. More developed terrestrial ecosystems will be more effective at dissipating the solar gradient (degrading its energy content). This can be measured by the effective surface temperature of the ecosystem on a landscape scale. A series of airborne thermal infrared multispectral scanner data were collected from several forested ecosystems ranging from a western US Douglas-fir forest to a tropical rain forest in Costa Rica. These data were used to develop measures of ecosystem development and integrity based on surface temperature.
Model-Based Reasoning in Upper-division Lab Courses
NASA Astrophysics Data System (ADS)
Lewandowski, Heather
2015-05-01
Modeling, which includes developing, testing, and refining models, is a central activity in physics. Well-known examples from AMO physics include everything from the Bohr model of the hydrogen atom to the Bose-Hubbard model of interacting bosons in a lattice. Modeling, while typically considered a theoretical activity, is most fully represented in the laboratory, where measurements of real phenomena intersect with theoretical models, leading to refinement of models and experimental apparatus. However, experimental physicists use models in complex ways, and the process is often not made explicit in physics laboratory courses. We have developed a framework to describe the modeling process in physics laboratory activities. The framework attempts to abstract and simplify the complex modeling process undertaken by expert experimentalists. The framework can be applied to understand typical processes such as the modeling of measurement tools, modeling "black boxes," and signal processing. We demonstrate that the framework captures several important features of model-based reasoning in a way that can reveal common student difficulties in the lab and guide the development of curricula that emphasize modeling in the laboratory. We also use the framework to examine troubleshooting in the lab and guide students to effective methods and strategies.
Modeling of fugitive dust emission for construction sand and gravel processing plant.
Lee, C H; Tang, L W; Chang, C T
2001-05-15
Due to rapid economic development in Taiwan, a large quantity of construction sand and gravel is needed to support domestic civil construction projects. However, a construction sand and gravel processing plant is often a major source of air pollution, due to its associated fugitive dust emission. To predict the amount of fugitive dust emitted from this kind of processing plant, a semiempirical model was developed in this study. This model was developed on the basis of the actual dust emission data (i.e., total suspended particulate, TSP) and four on-site operating parameters (i.e., wind speed (u), soil moisture (M), soil silt content (s), and number (N) of trucks) measured at a construction sand and gravel processing plant. On the basis of the on-site measured data and an SAS nonlinear regression program, the expression of this model is E = 0.011 u^2.653 M^(-1.875) s^0.060 N^0.896, where E is the amount (kg/ton) of dust emitted during the production of each ton of gravel and sand. This model can serve as a facile tool for predicting the fugitive dust emission from a construction sand and gravel processing plant.
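The fitted model can be coded directly from the expression above; only the equation and its units come from the abstract, while the operating values in the example usage are invented for illustration:

```python
def dust_emission(u, M, s, N):
    """Fugitive dust emission E (kg per ton of sand/gravel produced).

    u: wind speed, M: soil moisture, s: soil silt content,
    N: number of trucks (units as measured in the original study).
    """
    return 0.011 * u**2.653 * M**(-1.875) * s**0.060 * N**0.896

# Illustrative comparison: doubling the wind speed at fixed moisture,
# silt content, and truck count raises the predicted emission.
low_wind = dust_emission(2.0, 10.0, 20.0, 5.0)
high_wind = dust_emission(4.0, 10.0, 20.0, 5.0)
```

The exponents make the model's physics legible: emission rises steeply with wind speed (exponent 2.653), falls with soil moisture (negative exponent), and is nearly insensitive to silt content (exponent 0.060).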
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang Zhili; Shneider, Mikhail N.
2010-03-15
This paper presents experimental measurements and a computational model of sodium plasma decay processes in a mixture of sodium and argon, using radar resonance-enhanced multiphoton ionization (REMPI), i.e., coherent microwave Rayleigh scattering from REMPI. A single laser beam resonantly ionizes the sodium atoms by means of a 2+1 REMPI process. The laser beam ionizes only the sodium atoms, with negligible ionization of argon. Coherent microwave scattering measures in situ the total electron number in the laser-induced plasma. Since the sodium ions decay by recombination with electrons, microwave scattering directly measures the plasma decay processes of the sodium ions. A theoretical plasma dynamic model, including REMPI of the sodium and electron avalanche ionization (EAI) of sodium and argon in the gas mixture, has been developed. It confirms that the EAI of argon is several orders of magnitude lower than the REMPI of sodium. The theoretical prediction made for the decay process of the sodium plasma in the mixture matches the experimental measurement.
A practical approach to competency assessment.
Claflin, N
1997-01-01
Assessing clinical performance is difficult. Members of the Nursing Service Clinical Practice Committee at the Carl T. Hayden Veterans Affairs Medical Center in Phoenix developed a comprehensive program of competency assessment based on performance measures. This article describes the committee's process of developing and implementing the program and includes a blueprint for competency assessment and selected performance measures for all nursing staff who provide patient care. The approach to competency assessment includes performance measures specific to patients' ages.
System For Characterizing Three-Phase Brushless dc Motors
NASA Technical Reports Server (NTRS)
Howard, David E.; Smith, Dennis A.
1996-01-01
System of electronic hardware and software developed to automate measurements and calculations needed to characterize electromechanical performances of three-phase brushless dc motors, associated shaft-angle sensors needed for commutation, and associated brushless tachometers. System quickly takes measurements on all three phases of motor, tachometer, and shaft-angle sensor simultaneously and processes measurements into performance data. Also useful in development and testing of motors with not only three phases but also two, four, or more phases.
Sense of competence in dementia care staff (SCIDS) scale: development, reliability, and validity.
Schepers, Astrid Kristine; Orrell, Martin; Shanahan, Niamh; Spector, Aimee
2012-07-01
Sense of competence in dementia care staff (SCIDS) may be associated with more positive attitudes to dementia among care staff and better outcomes for those being cared for. There is a need for a reliable and valid measure of sense of competence specific to dementia care staff. This study describes the development and evaluation of a measure to assess "sense of competence" in dementia care staff and reports on its psychometric properties. The systematic measure development process involved care staff and experts. For item selection and assessment of psychometric properties, a pilot study (N = 37) and a large-scale study (N = 211) with a test-retest reliability (N = 58) sub-study were undertaken. The final measure consists of 17 items across four subscales with acceptable to good internal consistency and moderate to substantial test-retest reliability. As predicted, the measure was positively associated with work experience, job satisfaction, and person-centered approaches to dementia care, giving a first indication for its validity. The SCIDS scale provides a useful and user-friendly means of measuring sense of competence in care staff. It has been developed using a robust process and has adequate psychometric properties. Further exploration of the construct and the scale's validity is warranted. It may be useful to assess the impact of training and perceived abilities and skills in dementia care.
Patient autonomy in multiple sclerosis--possible goals and assessment strategies.
Heesen, C; Köpke, S; Solari, A; Geiger, F; Kasper, J
2013-08-15
Patient autonomy has been increasingly acknowledged as a prerequisite for successful medical decision making in Western countries. In medical decisions with a need to involve a health professional, patient autonomy becomes apparent in the extent of patients' participation in the communication, as described in the concept of shared decision making. Patient autonomy can be derived from different perspectives or goals, and the focus of evaluation approaches may vary accordingly. Multiple sclerosis (MS) is a paradigmatic disease to study patient autonomy, mainly because MS patients are highly disease competent and due to ambiguous evidence on many aspects of disease-related medical decision making. This review gives an overview on measurement issues in studying decision making in MS, categorized according to prerequisites, process measures and outcomes of patient autonomy. As relevant prerequisites, role preferences, risk attribution, risk tolerance, and risk knowledge are discussed. Regarding processes, we distinguish intra-psychic and interpersonal aspects. Intra-psychic processes are elucidated using the theory of planned behavior, which guided development of a 30-item scale to capture decisions about immunotherapy. Moreover, a theory of uncertainty management has been created, resulting in the development of a corresponding measurement concept. Interpersonal processes evolving between physician and patient can be thoroughly analyzed from different perspectives by use of the newly developed comprehensive MAPPIN'SDM inventory. Concerning outcomes, besides health-related outcomes, we discuss match of preferred roles during the decision encounters (preference match), decisional conflict, as well as an application of the multidimensional measure of informed choice to decisions of MS patients. These approaches provide an overview on patient-inherent and interpersonal factors and processes modulating medical decision making and health behavior in MS and beyond. Copyright © 2013. Published by Elsevier B.V.
Stabilization of glucose-oxidase in the graphene paste for screen-printed glucose biosensor
NASA Astrophysics Data System (ADS)
Pepłowski, Andrzej; Janczak, Daniel; Jakubowska, Małgorzata
2015-09-01
Various methods and materials for enzyme stabilization within a screen-printed graphene sensor were analyzed. The main goal was to develop a technology allowing immediate printing of biosensors in a single printing process. The factors considered were the toxicity of the materials used, the ability of the material to be screen-printed (squeezed through the printing mesh), and the temperatures required in the fabrication process. Performance of the examined sensors was measured using the chemical amperometry method, and an appropriate analysis of the measurements was then conducted. The analysis results were then compared with the medical requirements. The parameters calculated were the correlation coefficient between the concentration of the analyte and the measured electrical current (0.986) and the variation coefficient for the particular analyte concentrations used as calibration points. Variation of the measured values was significant only in ranges close to 0, decreasing for concentrations of clinical importance. These outcomes justify further development of graphene-based biosensors fabricated through printing techniques.
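The calibration statistics named in the abstract above (a correlation coefficient between analyte concentration and measured current, and a variation coefficient per calibration point) can be sketched in a few lines. This is an illustrative sketch only; the data values in the usage example are hypothetical, not the study's measurements:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between, e.g., analyte concentration and measured current."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def variation_coefficient(values):
    """CV (sample standard deviation / mean) for replicate readings at one calibration point."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return sd / mean

# Hypothetical calibration data: concentrations vs. measured currents
r = pearson_r([1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.0, 8.1])
cv = variation_coefficient([9.9, 10.0, 10.1])
```

A correlation near 1 and small per-point CVs at clinically relevant concentrations would correspond to the behavior the abstract reports.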
NASA Astrophysics Data System (ADS)
Zhao, H.; Zhang, S.
2008-01-01
One of the most effective means to achieve controlled auto-ignition (CAI) combustion in a gasoline engine is by the residual gas trapping method. The amount of residual gas and mixture composition have significant effects on the subsequent combustion process and engine emissions. In order to obtain quantitative measurements of in-cylinder residual gas concentration and air/fuel ratio, a spontaneous Raman scattering (SRS) system has been developed recently. The optimized optical SRS setups are presented and discussed. The temperature effect on the SRS measurement is considered and a method has been developed to correct for the overestimated values due to the temperature effect. Simultaneous measurements of O2, H2O, CO2 and fuel were obtained throughout the intake, compression, combustion and expansion strokes. It shows that the SRS can provide valuable data on this process in a CAI combustion engine.
Dynamic single photon emission computed tomography—basic principles and cardiac applications
Gullberg, Grant T; Reutter, Bryan W; Sitek, Arkadiusz; Maltz, Jonathan S; Budinger, Thomas F
2011-01-01
The very nature of nuclear medicine, the visual representation of injected radiopharmaceuticals, implies imaging of dynamic processes such as the uptake and wash-out of radiotracers from body organs. For years, nuclear medicine has been touted as the modality of choice for evaluating function in health and disease. This evaluation is greatly enhanced using single photon emission computed tomography (SPECT), which permits three-dimensional (3D) visualization of tracer distributions in the body. However, to fully realize the potential of the technique requires the imaging of in vivo dynamic processes of flow and metabolism. Tissue motion and deformation must also be addressed. Absolute quantification of these dynamic processes in the body has the potential to improve diagnosis. This paper presents a review of advancements toward the realization of the potential of dynamic SPECT imaging and a brief history of the development of the instrumentation. A major portion of the paper is devoted to the review of special data processing methods that have been developed for extracting kinetics from dynamic cardiac SPECT data acquired using rotating detector heads that move as radiopharmaceuticals exchange between biological compartments. Recent developments in multi-resolution spatiotemporal methods enable one to estimate kinetic parameters of compartment models of dynamic processes using data acquired from a single camera head with slow gantry rotation. The estimation of kinetic parameters directly from projection measurements improves bias and variance over the conventional method of first reconstructing 3D dynamic images, generating time–activity curves from selected regions of interest and then estimating the kinetic parameters from the generated time–activity curves. 
Although the potential applications of SPECT for imaging dynamic processes have not been fully realized in the clinic, it is hoped that this review illuminates the potential of SPECT for dynamic imaging, especially in light of new developments that enable measurement of dynamic processes directly from projection measurements. PMID:20858925
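The compartment-model kinetics discussed in the SPECT review above can be illustrated with a deliberately simplified sketch: a one-compartment tissue model with a constant unit arterial input, fitted by least-squares grid search over the wash-out constant. This is not the authors' direct-from-projections estimator, only the conventional fit to a generated time-activity curve, with hypothetical parameter values:

```python
import math

def tac(t, k1, k2):
    """Tissue time-activity for a one-compartment model with constant unit
    arterial input: C(t) = (k1 / k2) * (1 - exp(-k2 * t))."""
    return (k1 / k2) * (1.0 - math.exp(-k2 * t))

def fit_k2(times, activities, k1, k2_grid):
    """Least-squares grid search for the wash-out constant k2 (k1 assumed known)."""
    def sse(k2):
        return sum((tac(t, k2=k2, k1=k1) - a) ** 2
                   for t, a in zip(times, activities))
    return min(k2_grid, key=sse)

# Hypothetical example: generate a curve with k1 = 0.8, k2 = 0.3, then recover k2
times = [0.5 * i for i in range(1, 21)]
data = [tac(t, 0.8, 0.3) for t in times]
k2_hat = fit_k2(times, data, 0.8, [0.05 * i for i in range(1, 21)])
```

Estimating such parameters directly from projection data, as the review describes, avoids the intermediate reconstruction and region-of-interest steps that this conventional approach requires.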
Application of CMOS Technology to Silicon Photomultiplier Sensors
D’Ascenzo, Nicola; Zhang, Xi; Xie, Qingguo
2017-01-01
We use the 180 nm GLOBALFOUNDRIES (GF) BCDLite CMOS process for the production of a silicon photomultiplier prototype. We study the main characteristics of the developed sensor in comparison with commercial SiPMs obtained in custom technologies and other SiPMs developed with CMOS-compatible processes. We support our discussion with a transient modeling of the detection process of the silicon photomultiplier as well as with a series of static and dynamic experimental measurements in dark and illuminated environments. PMID:28946675
Vaccarino, Anthony L; Evans, Kenneth R; Kalali, Amir H; Kennedy, Sidney H; Engelhardt, Nina; Frey, Benicio N; Greist, John H; Kobak, Kenneth A; Lam, Raymond W; MacQueen, Glenda; Milev, Roumen; Placenza, Franca M; Ravindran, Arun V; Sheehan, David V; Sills, Terrence; Williams, Janet B W
2016-01-01
The Depression Inventory Development project is an initiative of the International Society for CNS Drug Development whose goal is to develop a comprehensive and psychometrically sound measurement tool to be utilized as a primary endpoint in clinical trials for major depressive disorder. Using an iterative process between field testing and psychometric analysis and drawing upon expertise of international researchers in depression, the Depression Inventory Development team has established an empirically driven and collaborative protocol for the creation of items to assess symptoms in major depressive disorder. Depression-relevant symptom clusters were identified based on expert clinical and patient input. In addition, as an aid for symptom identification and item construction, the psychometric properties of existing clinical scales (assessing depression and related indications) were evaluated using blinded datasets from pharmaceutical antidepressant drug trials. A series of field tests in patients with major depressive disorder provided the team with data to inform the iterative process of scale development. We report here an overview of the Depression Inventory Development initiative, including results of the third iteration of items assessing symptoms related to anhedonia, cognition, fatigue, general malaise, motivation, anxiety, negative thinking, pain and appetite. The strategies adopted from the Depression Inventory Development program, as an empirically driven and collaborative process for scale development, have provided the foundation to develop and validate measurement tools in other therapeutic areas as well.
Digital image processing of bone - Problems and potentials
NASA Technical Reports Server (NTRS)
Morey, E. R.; Wronski, T. J.
1980-01-01
The development of a digital image processing system for bone histomorphometry and fluorescent marker monitoring is discussed. The system in question is capable of making measurements of UV or light microscope features on a video screen with either video or computer-generated images, and comprises a microscope, low-light-level video camera, video digitizer and display terminal, color monitor, and PDP 11/34 computer. Capabilities demonstrated in the analysis of an undecalcified rat tibia include the measurement of perimeter and total bone area, and the generation of microscope images, false color images, digitized images and contoured images for further analysis. Software development will be based on an existing software library, specifically the mini-VICAR system developed at JPL. It is noted that the potentials of the system in terms of speed and reliability far exceed any problems associated with hardware and software development.
2003-06-26
VANDENBERG AIR FORCE BASE, CALIF. - The SciSat-1 spacecraft is revealed at Vandenberg Air Force Base, Calif. Sci-Sat, which will undergo instrument checkout and spacecraft functional testing, weighs approximately 330 pounds and after launch will be placed in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
2003-08-09
VANDENBERG AIR FORCE BASE, CALIF. - The Pegasus transporter, with its cargo of the SciSat-1 payload and Pegasus launch vehicle, moves under the L-1011 carrier aircraft for mating.
2003-08-12
VANDENBERG AIR FORCE BASE, CALIF. - The L-1011 carrier aircraft is in flight with its cargo of the Pegasus launch vehicle and SciSat-1 spacecraft underneath.
2003-08-09
VANDENBERG AIR FORCE BASE, CALIF. - The SciSat-1 payload, with fairing installed and attached to its Pegasus launch vehicle, arrives at the pad for mating to the L-1011 carrier aircraft.
2003-07-29
VANDENBERG AIR FORCE BASE, CALIF. - With its cover removed, the SciSat-1 spacecraft is rotated. The solar arrays will be attached and the communications systems checked out.
2003-06-26
VANDENBERG AIR FORCE BASE, CALIF. - The cover is being lifted off the SciSat-1 spacecraft.
Carlson, Judy; Cohen, Roslyn; Bice-Stephens, Wynona
2014-01-01
As a part of our nation's pursuit of improvements in patient care outcomes, continuity of care, and cost containment, the case manager has become a vital member of interdisciplinary teams and of health care agencies. Telebehavioral health programs, as a relatively new method of delivering behavioral health care, have recently begun to incorporate case management into their multidisciplinary teams. To determine the efficacy and efficiency of healthcare programs, program managers are charged with the determination of the outcomes of the care rendered to patient populations. However, programs that use telehealth methods to deliver care have unique structures in place that impact the ability to collect outcome data. A military medical center that serves the Pacific region developed surveys and processes to distribute, administer, and collect information about a telehealth environment to obtain outcome data for the nurse case manager. This report describes the survey development and the processes created to capture nurse case manager outcomes. Additionally, the surveys and processes developed in this project for measuring outcomes may be useful in other settings and disciplines.
Acoustic emission measurements of aerospace materials and structures
NASA Technical Reports Server (NTRS)
Sachse, Wolfgang; Gorman, Michael R.
1993-01-01
A development status evaluation is given for aerospace applications of AE location, detection, and source characterization. Attention is given to the neural-like processing of AE signals for graphite/epoxy. It is recommended that development efforts for AE make connections between the material failure process and source dynamics, and study the effects of composite material anisotropy and inhomogeneity on the propagation of AE waves. Broadband, as well as frequency- and wave-mode selective sensors, need to be developed.
ERIC Educational Resources Information Center
Susac, Ana; Bubic, Andreja; Martinjak, Petra; Planinic, Maja; Palmovic, Marijan
2017-01-01
Developing a better understanding of the measurement process and measurement uncertainty is one of the main goals of university physics laboratory courses. This study investigated the influence of graphical representation of data on student understanding and interpreting of measurement results. A sample of 101 undergraduate students (48 first year…
NASA Astrophysics Data System (ADS)
Ischak, M.; Setioko, B.; Nurgandarum, D.
2018-01-01
The growth trend of Jakarta as a metropolitan city is now the construction of large-scale planned settlements, often referred to as new towns, carried out by major developers. The processes of land tenure and of constructing the new town directly border the pre-existing original settlements and shape the pattern or types of original settlements in the context of their relationship with the new town. This research was intended to measure the scale of sustainability resulting from land expansion by new town developers, measured from the side of the original settlers who still remain. The research method used was descriptive-explorative: formulating sustainability criteria that best match the research context and using these criteria as a tool to measure the sustainability level of new town development at the research site, the new town of Gading Serpong, Tangerang. The research concludes that despite the apparent displacement and restriction of the original settlements' lands, new town development overall meets the sustainability criteria when viewed from the residents of the three types of original settlements.
Developing a Web-Based Nursing Practice and Research Information Management System: A Pilot Study.
Choi, Jeeyae; Lapp, Cathi; Hagle, Mary E
2015-09-01
Many hospital information systems have been developed and implemented to collect clinical data from the bedside and have used the information to improve patient care. Because of a growing awareness that the use of clinical information improves quality of care and patient outcomes, measuring tools (electronic and paper based) have been developed, but most of them require multiple steps of data collection and analysis. This necessitated the development of a Web-based Nursing Practice and Research Information Management System that processes clinical nursing data to measure nurses' delivery of care and its impact on patient outcomes and provides useful information to clinicians, administrators, researchers, and policy makers at the point of care. This pilot study developed a computer algorithm based on a falls prevention protocol and programmed the prototype Web-based Nursing Practice and Research Information Management System. It successfully measured the performance of nursing care delivered and its impact on patient outcomes using clinical nursing data from the study site. Although the Nursing Practice and Research Information Management System was tested with small data sets, results of the study revealed that it has the potential to measure nurses' delivery of care and its impact on patient outcomes, while pinpointing components of the nursing process in need of improvement.
Melt-Pool Temperature and Size Measurement During Direct Laser Sintering
DOE Office of Scientific and Technical Information (OSTI.GOV)
List, III, Frederick Alyious; Dinwiddie, Ralph Barton; Carver, Keith
2017-08-01
Additive manufacturing has demonstrated the ability to fabricate complex geometries and components not possible with conventional casting and machining. In many cases, industry has demonstrated the ability to fabricate complex geometries with improved efficiency and performance. However, qualification and certification of processes is challenging, leaving companies to focus on certification of material through design-allowable-based approaches. This significantly reduces the business case for additive manufacturing. Therefore, real-time monitoring of the melt pool can be used to detect the development of flaws, such as porosity or un-sintered powder, and aid in the certification process. Characteristics of the melt pool in the Direct Laser Sintering (DLS) process are also of great interest to modelers who are developing simulation models needed to improve and perfect the DLS process. Such models could provide a means to rapidly develop the optimum processing parameters for new alloy powders and optimize processing parameters for specific part geometries. Stratonics' ThermaViz system will be integrated with the Renishaw DLS system in order to demonstrate its ability to measure melt pool size, shape and temperature. These results will be compared with data from an existing IR camera to determine the best approach for the determination of these critical parameters.
Acquisition of Programming Skills
1990-04-01
skills (e.g., arithmetic reasoning, work knowledge, information processing speed); and c) passive versus active learning style. Ability measures...concurrent storage and processing an individual was capable of doing), and an active learning style. Implications of the findings for the development of
Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng
2017-12-01
Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
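One way the precision and proportionality indicators described in the abstract above could look in code, as a hedged sketch (the paper's exact statistics may differ, and the numbers in the usage example are hypothetical): fit an ordinary least-squares regression of log counts against log dilution fraction, where a slope near 1 indicates proportionality, and use the replicate coefficient of variation as a repeatability indicator.

```python
import math

def slope_loglog(fractions, counts):
    """OLS slope of log(count) vs. log(dilution fraction).
    A slope near 1.0 suggests the counting method responds proportionally."""
    xs = [math.log(f) for f in fractions]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def replicate_cv(counts):
    """Coefficient of variation across replicate counts of the same sample."""
    n = len(counts)
    m = sum(counts) / n
    sd = math.sqrt(sum((c - m) ** 2 for c in counts) / (n - 1))
    return sd / m

# Hypothetical 2-fold dilution series and replicate counts
slope = slope_loglog([1.0, 0.5, 0.25, 0.125], [1000, 500, 250, 125])
cv = replicate_cv([98, 100, 102])
```

Comparing these quantities across two counting methods on the same dilution series mirrors the paper's approach of ranking methods without a reference material.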
NASA Astrophysics Data System (ADS)
Fang, Jun
Thermotropic liquid crystalline polymers (TLCPs) are a class of promising engineering materials for high-demanding structural applications. Their excellent mechanical properties are highly correlated to the underlying molecular orientation states, which may be affected by complex flow fields during melt processing. Thus, understanding and eventually predicting how processing flows impact molecular orientation is a critical step towards rational design work in order to achieve favorable, balanced physical properties in finished products. This thesis aims to develop deeper understanding of orientation development in commercial TLCPs during processing by coordinating extensive experimental measurements with numerical computations. In situ measurements of orientation development of LCPs during processing are a focal point of this thesis. An x-ray capable injection molding apparatus is enhanced and utilized for time-resolved measurements of orientation development in multiple commercial TLCPs during injection molding. Ex situ wide angle x-ray scattering is also employed for more thorough characterization of molecular orientation distributions in molded plaques. Incompletely injection molded plaques ("short shots") are studied to gain further insights into the intermediate orientation states during mold filling. Finally, two surface orientation characterization techniques, near edge x-ray absorption fine structure (NEXAFS) and infrared attenuated total reflectance (FTIR-ATR) are combined to investigate the surface orientation distribution of injection molded plaques. Surface orientation states are found to be vastly different from their bulk counterparts due to different kinematics involved in mold filling. In general, complex distributions of orientation in molded plaques reflect the spatially varying competition between shear and extension during mold filling. 
To complement these experimental measurements, numerical calculations based on the Larson-Doi polydomain model are performed. The implementation of the Larson-Doi model in complex processing flows is performed using a commercial process modeling software suite (MOLDFLOW®), exploiting a nearly exact analogy between the Larson-Doi model and a fiber orientation model that has been widely used in composites processing simulations. The modeling scheme is first verified by predicting many qualitative and quantitative features of molecular orientation distributions in isothermal extrusion-fed channel flows. In coordination with experiments, the model predictions are found to capture many qualitative features observed in injection molded plaques (including short shots). The final, stringent test of the Larson-Doi model's performance is prediction of in situ transient orientation data collected during mold filling. The model yields satisfactory results, though certain numerical approximations limit performance near the mold front.
NASA Astrophysics Data System (ADS)
Makarycheva, A. I.; Faerman, V. A.
2017-02-01
An analysis of automation patterns is performed, and a programming solution for automating the processing of chromatographic data and their subsequent storage is developed with the help of a software package, Mathcad, and MS Excel spreadsheets. The offered approach allows modification of the data processing algorithm and does not require the participation of programming experts. The approach provides measurement of retention times and retention volumes, specific retention volumes, differential molar free energy of adsorption, partial molar solution enthalpies, and isosteric heats of adsorption. The developed solution is aimed at use in a small research group and has been tested on a series of new gas chromatography sorbents. More than 20 analytes were subjected to calculation of retention parameters and thermodynamic sorption quantities. The resulting data are provided in a form accessible to comparative analysis and make it possible to find sorbents with the most profitable properties for solving concrete analytical tasks.
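The retention quantities listed above follow standard gas-chromatography formulas. As a hedged sketch (the function names and example numbers are mine, not from the paper), the specific retention volume per gram of sorbent can be computed from retention times, carrier flow, column temperature, and the James-Martin pressure correction:

```python
def james_martin_j(p_in, p_out):
    """James-Martin gas compressibility correction factor
    for inlet pressure p_in and outlet pressure p_out (p_in > p_out)."""
    r = p_in / p_out
    return 1.5 * (r ** 2 - 1.0) / (r ** 3 - 1.0)

def specific_retention_volume(t_r, t_m, flow, t_col_k, mass_g, j):
    """Specific retention volume referred to 0 degrees C, per gram of sorbent:
    Vg = j * F * (t_r - t_m) * (273.15 / T_col) / m."""
    return j * flow * (t_r - t_m) * (273.15 / t_col_k) / mass_g

# Hypothetical run: retention 5 min, dead time 1 min, 30 mL/min carrier flow,
# column at 373.15 K, 1 g of sorbent, negligible pressure drop (j = 1)
vg = specific_retention_volume(5.0, 1.0, 30.0, 373.15, 1.0, 1.0)
```

Thermodynamic quantities such as the differential molar free energy of adsorption are then typically derived from Vg and its temperature dependence, which is the kind of downstream calculation the described spreadsheet automation performs.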
In-situ measurement of processing properties during fabrication in a production tool
NASA Technical Reports Server (NTRS)
Kranbuehl, D. E.; Haverty, P.; Hoff, M.; Loos, A. C.
1988-01-01
Progress is reported on the use of frequency-dependent electromagnetic measurements (FDEMs) as a single, convenient technique for continuous in situ monitoring of polyester cure during fabrication in a laboratory and manufacturing environment. Preliminary FDEM sensor and modeling work using the Loos-Springer model in order to develop an intelligent closed-loop, sensor-controlled cure process is described. FDEM measurement using impedance bridges in the Hz-to-MHz region is found to be ideal for automatically monitoring polyester processing properties continuously throughout the cure cycle.
NASA Technical Reports Server (NTRS)
Kranbuehl, D. E.; Delos, S. E.; Hoff, M. S.; Weller, L. W.; Haverty, P. D.
1987-01-01
An in situ NDE dielectric impedance measurement method has been developed for ascertaining the cure processing properties of high temperature advanced thermoplastic and thermosetting resins, using continuous frequency-dependent measurements and analyses of complex permittivity over 9 orders of magnitude and 6 decades of frequency at temperatures up to 400 C. Both ionic and Debye-like dipolar relaxation processes are monitored. Attention is given to LARC-TPI, PEEK, and poly(arylene ether) resins' viscosity, glass transition temperature, recrystallization, and residual solvent content and evolution properties.
Improvement of tritium accountancy technology for ITER fuel cycle safety enhancement
NASA Astrophysics Data System (ADS)
O'hira, S.; Hayashi, T.; Nakamura, H.; Kobayashi, K.; Tadokoro, T.; Nakamura, H.; Itoh, T.; Yamanishi, T.; Kawamura, Y.; Iwai, Y.; Arita, T.; Maruyama, T.; Kakuta, T.; Konishi, S.; Enoeda, M.; Yamada, M.; Suzuki, T.; Nishi, M.; Nagashima, T.; Ohta, M.
2000-03-01
In order to improve the safe handling and control of tritium for the ITER fuel cycle, effective in situ tritium accounting methods have been developed at the Tritium Process Laboratory in the Japan Atomic Energy Research Institute under one of the ITER-EDA R&D tasks. The remote and multilocation analysis of process gases by an application of laser Raman spectroscopy, developed and tested, could provide a measurement of hydrogen isotope gases with a detection limit of 0.3 kPa and analytical periods of 120 s. An in situ tritium inventory measurement by application of a 'self-assaying' storage bed with 25 g tritium capacity could provide a measurement with the required detection limit of less than 1% and a design proof of a bed with 100 g tritium capacity.
Signal processing methods for MFE plasma diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Candy, J.V.; Casper, T.; Kane, R.
1985-02-01
The application of various signal processing methods to extract energy storage information from plasma diamagnetism sensors during physics experiments on the Tandem Mirror Experiment-Upgrade (TMX-U) is discussed. We show how these processing techniques can be used to decrease the uncertainty in the corresponding sensor measurements. The algorithms suggested are implemented using SIG, an interactive signal processing package developed at LLNL.
Report #13-P-0167, February 28, 2013. Rule development is one of the Agency’s principal tasks. EPA develops rules to carry out the environmental and public health protection laws passed by Congress.
Enterprise Professional Development--Evaluating Learning
ERIC Educational Resources Information Center
Murphy, Gerald A.; Calway, Bruce A.
2010-01-01
Whilst professional development (PD) is an activity required by many regulatory authorities, the value that enterprises obtain from PD is often unknown, particularly when it involves development of knowledge. This paper discusses measurement techniques and processes and provides a review of established evaluation techniques, highlighting…
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
A new multiple air beam approach for in-process form error optical measurement
NASA Astrophysics Data System (ADS)
Gao, Y.; Li, R.
2018-07-01
In-process measurement can provide feedback for the control of workpiece precision in terms of size, roughness and, in particular, mid-spatial-frequency form error. Optical measurement methods are non-contact and possess the high precision required for in-process form error measurement. In precision machining, coolant is commonly used to reduce heat generation and thermal deformation on the workpiece surface. However, the use of coolant creates an opaque coolant barrier if optical measurement methods are used. In this paper, a new multiple air beam approach is proposed. The new approach permits the displacement of coolant arriving from any direction and with a large thickness, i.e. with a large amount of coolant. The model, the working principle, and the key features of the new approach are presented. Based on the proposed approach, a new in-process form error optical measurement system is developed. The coolant removal capability and the performance of the new multiple air beam approach are assessed. The experimental results show that the workpiece surface y(x, z) can be measured successfully with a standard deviation of up to 0.3011 µm even under a large amount of coolant, with a coolant thickness of 15 mm. This corresponds to a relative uncertainty (2σ) of up to 4.35%, with the workpiece surface deeply immersed in the opaque coolant. The results also show that, in terms of coolant removal capability, air supply and air velocity, the proposed approach improves on the previous single air beam approach by factors of 3.3, 1.3 and 5.3, respectively. The results demonstrate the significant improvements brought by the new multiple air beam method together with the developed measurement system.
Measuring emotion socialization in schools.
Horner, Christy G; Wallace, Tanner L
2013-10-01
Understanding how school personnel can best support students' development of communication skills around feelings is critical to long-term health outcomes. The measurement of emotion socialization in schools facilitates future research in this area; we review existing measures of emotion socialization to assess their applicability to school-based health studies. A content analysis of four emotion socialization measures was conducted. Inclusion criteria included: high frequency of use in prior research, established documentation of validity and reliability, and sufficient description of measurement procedures. Four dimensions emerged as particularly salient to a measure's future relevance and applicability to school-based health studies: (1) methods of measurement; (2) mode and agent of socialization; (3) type of emotion; and (4) structure versus function of socializing behavior. Future measurement strategies should address (1) the structures of emotion socializing processes; (2) diverse socializing agents such as teachers, peers, and administrators; (3) the intended functions of such processes; (4) student perceptions of and responses to such processes; and (5) the complex interactions of these factors across contexts. Strategies attending to these components will permit future studies of school-based emotion socializing processes to determine how they enhance health and reduce health risks. © 2013, American School Health Association.
2003-11-05
KENNEDY SPACE CENTER, FLA. - The Japanese Experiment Module (JEM) is moved on its workstand in the Space Station Processing Facility. The JEM will undergo pre-assembly measurements. Developed by the Japan Aerospace Exploration Agency (JAXA), the JEM will enhance the unique research capabilities of the orbiting complex by providing an additional environment for astronauts to conduct science experiments.
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
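A Shewhart individuals chart of the kind applied to check-standard repeat runs can be sketched in a few lines. The data and the measurement scenario below are illustrative assumptions, not Langley results.

```python
# Sketch of a Shewhart individuals (X/mR) control chart: the process sigma is
# estimated from the mean moving range (d2 = 1.128 for ranges of two points),
# and 3-sigma control limits are set about the center line.
def control_limits(values):
    """Return (center, lower, upper) 3-sigma limits for an individuals chart."""
    center = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma_hat = mr_bar / 1.128  # d2 constant for subgroups of size 2
    return center, center - 3 * sigma_hat, center + 3 * sigma_hat

# Hypothetical check-standard drag-coefficient repeat measurements
repeats = [0.0251, 0.0249, 0.0252, 0.0250, 0.0248, 0.0251, 0.0250]
center, lcl, ucl = control_limits(repeats)
in_control = all(lcl <= v <= ucl for v in repeats)
print(in_control)  # prints True
```

A point falling outside the limits would flag the measurement process as out of statistical control, prompting investigation before customer data are taken.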
Measurement of actinides and strontium-90 in high activity waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, S.L. III; Nelson, M.R.
1994-08-01
The reliable measurement of trace radionuclides in high activity waste is important to support waste processing activities at SRS (F and H Area Waste Tanks, Extended Sludge Processing (ESP) and In-Tank Precipitation (ITP) processing). Separation techniques are needed to remove high levels of gamma activity and alpha/beta interferences prior to analytical measurement. Using new extraction chromatographic resins from EiChrom Industries, Inc., the SRS Central Laboratory has developed new high speed separation methods that enable measurement of neptunium, thorium, uranium, plutonium, americium and strontium-90 in high activity waste solutions. Small particle size resin and applied vacuum are used to reduce analysis times and enhance column performance. Extraction chromatographic resins are easy to use and eliminate the generation of contaminated liquid organic waste.
Behavioral similarity measurement based on image processing for robots that use imitative learning
NASA Astrophysics Data System (ADS)
Sterpin B., Dante G.; Martinez S., Fernando; Jacinto G., Edwar
2017-02-01
In the field of artificial societies, particularly those based on memetics, imitative behavior is essential for the development of cultural evolution. Applying this concept to robotics, through imitative learning a robot can acquire behavioral patterns from another robot. Assuming that the learning process must have an instructor and at least one apprentice, obtaining a quantitative measurement of their behavioral similarity would be potentially useful, especially in artificial social systems focused on cultural evolution. In this paper the motor behavior of both kinds of robots, for two simple tasks, is represented by 2D binary images, which are processed in order to measure their behavioral similarity. The results shown here were obtained by comparing several similarity measurement methods for binary images.
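As a concrete, hypothetical example of a similarity measure for such binary behaviour images, the Jaccard index over set pixels could be computed as follows. The paper compares several binary-image measures; this sketch shows only one common choice, with made-up images.

```python
# Minimal Jaccard similarity for two equal-size binary images, each given
# as a list of rows of 0/1 pixels: |intersection| / |union| of set pixels.
def jaccard(img_a, img_b):
    inter = union = 0
    for row_a, row_b in zip(img_a, img_b):
        for a, b in zip(row_a, row_b):
            inter += a & b
            union += a | b
    return inter / union if union else 1.0

# Hypothetical 3x3 "behaviour trace" images for instructor and apprentice
instructor = [[0, 1, 1],
              [0, 1, 0],
              [1, 1, 0]]
apprentice = [[0, 1, 1],
              [0, 1, 0],
              [1, 0, 0]]
print(jaccard(instructor, apprentice))  # prints 0.8
```

A score of 1.0 would indicate identical behaviour traces; lower values quantify how far the apprentice's motor behaviour departs from the instructor's.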
Methods of Measurement for Semiconductor Materials, Process Control, and Devices
NASA Technical Reports Server (NTRS)
Bullis, W. M. (Editor)
1973-01-01
The development of methods of measurement for semiconductor materials, process control, and devices is reported. Significant accomplishments include: (1) Completion of an initial identification of the more important problems in process control for integrated circuit fabrication and assembly; (2) preparations for making silicon bulk resistivity wafer standards available to the industry; and (3) establishment of the relationship between carrier mobility and impurity density in silicon. Work is continuing on measurement of resistivity of semiconductor crystals; characterization of generation-recombination-trapping centers, including gold, in silicon; evaluation of wire bonds and die attachment; study of scanning electron microscopy for wafer inspection and test; measurement of thermal properties of semiconductor devices; determination of S-parameters and delay time in junction devices; and characterization of noise and conversion loss of microwave detector diodes.
NASA Astrophysics Data System (ADS)
Tucker, Brian J.; Diaz, Aaron A.; Eckenrode, Brian A.
2006-03-01
Government agencies and homeland security related organizations have identified the need to develop and establish a wide range of unprecedented capabilities for providing scientific and technical forensic services to investigations involving hazardous chemical, biological, and radiological materials, including extremely dangerous chemical and biological warfare agents. Pacific Northwest National Laboratory (PNNL) has developed a prototype portable, hand-held, hazardous materials acoustic inspection prototype that provides noninvasive container interrogation and material identification capabilities using nondestructive ultrasonic velocity and attenuation measurements. Due to the wide variety of fluids as well as container sizes and materials encountered in various law enforcement inspection activities, the need for high measurement sensitivity and advanced ultrasonic measurement techniques were identified. The prototype was developed using a versatile electronics platform, advanced ultrasonic wave propagation methods, and advanced signal processing techniques. This paper primarily focuses on the ultrasonic measurement methods and signal processing techniques incorporated into the prototype. High bandwidth ultrasonic transducers combined with an advanced pulse compression technique allowed researchers to 1) obtain high signal-to-noise ratios and 2) obtain accurate and consistent time-of-flight (TOF) measurements through a variety of highly attenuative containers and fluid media. Results of work conducted in the laboratory have demonstrated that the prototype experimental measurement technique also provided information regarding container properties, which will be utilized in future container-independent measurements of hidden liquids.
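The material-identification idea can be illustrated with the basic pulse-echo relation c = 2d/TOF. The reference sound speeds and container width below are rough, assumed values for illustration, not PNNL calibration data.

```python
# Illustrative fluid identification from ultrasonic time-of-flight (TOF):
# for a pulse-echo path across a container of inner width d, the sound
# speed is c = 2*d / TOF; the fluid is matched to the nearest reference speed.
def sound_speed(d_m, tof_s):
    """Sound speed (m/s) from inner width d_m (m) and round-trip TOF (s)."""
    return 2.0 * d_m / tof_s

# Approximate room-temperature sound speeds (m/s) -- assumed reference table.
REFERENCE = {"water": 1482.0, "ethanol": 1144.0, "acetone": 1174.0}

def closest_fluid(c):
    return min(REFERENCE, key=lambda name: abs(REFERENCE[name] - c))

c = sound_speed(0.10, 135.0e-6)  # 10 cm container, 135 microsecond round trip
print(closest_fluid(c))  # prints water
```

A fielded instrument would additionally use the attenuation measurement and container-wall corrections described in the abstract to make the identification container-independent.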
QoS measurement of workflow-based web service compositions using Colored Petri net.
Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra
2014-01-01
Workflow-based web service compositions (WB-WSCs) form one of the main composition categories in service oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the category of WB-WSCs. With the maturity of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide good quality in a composed web service with regard to customers' requirements. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure, since from it the quality degree of a certain web service composition can be assessed. This paper presents a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPN) with explicit routing constructs and an application of the theory of probability. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
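The probabilistic side of such a QoS analysis can be sketched with elementary composition rules. This is a simplified stand-in for the CPN-based method and the WSET tool, with made-up per-service reliabilities.

```python
# Sketch of dependability composition for two common routing constructs:
# a sequence (all services must succeed) and an XOR choice (exactly one
# branch is taken, weighted by its routing probability).
def sequence(*reliabilities):
    """Reliability of services executed in sequence."""
    p = 1.0
    for r in reliabilities:
        p *= r
    return p

def xor_choice(branches):
    """branches: list of (routing_probability, reliability) pairs."""
    return sum(p * r for p, r in branches)

# Hypothetical composite: service A, then an XOR choice of B or C, then D.
composite = sequence(0.99, xor_choice([(0.7, 0.98), (0.3, 0.95)]), 0.97)
print(round(composite, 4))  # prints 0.9325
```

Performance measures such as mean response time compose analogously (sums over sequences, probability-weighted sums over choices), which is what makes a deterministic analytical treatment possible.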
Torque Measurement at the Single Molecule Level
Forth, Scott; Sheinin, Maxim Y.; Inman, James; Wang, Michelle D.
2017-01-01
Methods for exerting and measuring forces on single molecules have revolutionized the study of the physics of biology. However, it is often the case that biological processes involve rotation or torque generation, and these parameters have been more difficult to access experimentally. Recent advances in the single molecule field have led to the development of techniques which add the capability of torque measurement. By combining force, displacement, torque, and rotational data, a more comprehensive description of the mechanics of a biomolecule can be achieved. In this review, we highlight a number of biological processes for which torque plays a key mechanical role. We describe the various techniques that have been developed to directly probe the torque experienced by a single molecule, and detail a variety of measurements made to date using these new technologies. We conclude by discussing a number of open questions and propose systems of study which would be well suited for analysis with torsional measurement techniques. PMID:23541162
Morgan, Lauren; New, Steve; Robertson, Eleanor; Collins, Gary; Rivero-Arias, Oliver; Catchpole, Ken; Pickering, Sharon P; Hadi, Mohammed; Griffin, Damian; McCulloch, Peter
2015-02-01
Standard operating procedures (SOPs) should improve safety in the operating theatre, but controlled studies evaluating the effect of staff-led implementation are needed. In a controlled interrupted time series, we evaluated three team process measures (compliance with WHO surgical safety checklist, non-technical skills and technical performance) and three clinical outcome measures (length of hospital stay, complications and readmissions) before and after a 3-month staff-led development of SOPs. Process measures were evaluated by direct observation, using Oxford Non-Technical Skills II for non-technical skills and the 'glitch count' for technical performance. All staff in two orthopaedic operating theatres were trained in the principles of SOPs and then assisted to develop standardised procedures. Staff in a control operating theatre underwent the same observations but received no training. The change in difference between active and control groups was compared before and after the intervention using repeated measures analysis of variance. We observed 50 operations before and 55 after the intervention and analysed clinical data on 1022 and 861 operations, respectively. The staff chose to structure their efforts around revising the 'whiteboard' which documented and prompted tasks, rather than directly addressing specific task problems. Although staff preferred and sustained the new system, we found no significant differences in process or outcome measures before/after intervention in the active versus the control group. There was a secular trend towards worse outcomes in the postintervention period, seen in both active and control theatres. SOPs when developed and introduced by frontline staff do not necessarily improve operative processes or outcomes. The inherent tension in improvement work between giving staff ownership of improvement and maintaining control of direction needs to be managed, to ensure staff are engaged but invest energy in appropriate change. 
Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Measurement of Global Precipitation
NASA Technical Reports Server (NTRS)
Flaming, Gilbert Mark
2004-01-01
The Global Precipitation Measurement (GPM) Program is an international cooperative effort whose objectives are to (a) obtain increased understanding of rainfall processes, and (b) make frequent rainfall measurements on a global basis. The National Aeronautics and Space Administration (NASA) of the United States and the Japan Aerospace Exploration Agency (JAXA) have entered into a cooperative agreement for the formulation and development of GPM. This agreement is a continuation of the partnership that developed the highly successful Tropical Rainfall Measuring Mission (TRMM), launched in November 1997; this mission continues to provide valuable scientific and meteorological information on rainfall and the associated processes. International collaboration on GPM from other space agencies has been solicited, and discussions regarding their participation are currently in progress. NASA has taken lead responsibility for the planning and formulation of GPM. Key elements of the Program to be provided by NASA include a Core satellite bus instrumented with a multi-channel microwave radiometer, a Ground Validation System and a ground-based Precipitation Processing System (PPS). JAXA will provide a Dual-frequency Precipitation Radar for installation on the Core satellite and launch services. Other United States agencies and international partners may participate in a number of ways, such as providing rainfall measurements obtained from their own national space-borne platforms, providing local rainfall measurements to support the ground validation activities, or providing hardware or launch services for GPM constellation spacecraft. This paper will present an overview of the current planning for the GPM Program, and discuss in more detail the status of the lead author's primary responsibility, development and acquisition of the GPM Microwave Imager.
Experimental validation of pulsed column inventory estimators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyerlein, A.L.; Geldard, J.F.; Weh, R.
Near-real-time accounting (NRTA) for reprocessing plants relies on the timely measurement of all transfers through the process area and all inventory in the process. It is difficult to measure the inventory of the solvent contactors; therefore, estimation techniques are considered. We have used experimental data obtained at the TEKO facility in Karlsruhe and have applied computer codes developed at Clemson University to analyze these data. For uranium extraction, the computer predictions agree to within 15% of the measured inventories. We believe this study is significant in demonstrating that using theoretical models with a minimum amount of process data may be an acceptable approach to column inventory estimation for NRTA. 15 refs., 7 figs.
Developing a Set of Uniform Outcome Measures for Adult Day Services.
Anderson, Keith A; Geboy, Lyn; Jarrott, Shannon E; Missaelides, Lydia; Ogletree, Aaron M; Peters-Beumer, Lisa; Zarit, Steven H
2018-06-01
Adult day services (ADS) provide care to adults with physical, functional, and/or cognitive limitations in nonresidential, congregate, community-based settings. ADS programs have emerged as a growing and affordable approach within the home and community-based services sector. Although promising, the growth of ADS has been hampered by a lack of uniform outcome measures and data collection protocols. In this article, the authors detail a recent effort by leading researchers and practitioners in ADS to develop a set of uniform outcome measures. Based upon three recent efforts to develop outcome measures, selection criteria were established and an iterative process was conducted to debate the merits of outcome measures across three domains-participant well-being, caregiver well-being, and health care utilization. The authors conclude by proposing a uniform set of outcome measures to (a) standardize data collection, (b) aid in the development of programming, and (c) facilitate the leveraging of additional funding for ADS.
ERIC Educational Resources Information Center
Holt, Maurice
1995-01-01
The central idea in W. Edwards Deming's approach to quality management is the need to improve process. Outcome-based education's central defect is its failure to address process. Deming would reject OBE along with management-by-objectives. Education is not a product defined by specific output measures, but a process to develop the mind. (MLH)
Outcome-Based School-to-Work Transition Planning for Students with Severe Disabilities.
ERIC Educational Resources Information Center
Steere, Daniel E.; And Others
1990-01-01
A transition planning process that focuses on quality-of-life outcomes is presented. The process, which views employment not as an outcome but as a vehicle for the attainment of quality of life, involves six steps: orientation, personal profile development, identification of employment outcomes, measurement system, compatibility process, and…
The Architecture, Dynamics, and Development of Mental Processing: Greek, Chinese, or Universal?
ERIC Educational Resources Information Center
Demetriou, A.; Kui, Z.X.; Spanoudis, G.; Christou, C.; Kyriakides, L.; Platsidou, M.
2005-01-01
This study compared Greeks with Chinese, from 8 to 14 years of age, on measures of processing efficiency, working memory, and reasoning. All processes were addressed through three domains of relations: verbal/propositional, quantitative, and visuo/spatial. Structural equations modelling and rating scale analysis showed that the architecture and…
NBS (National Bureau of Standards): Materials measurements
NASA Technical Reports Server (NTRS)
Manning, J. R.
1984-01-01
Work in support of NASA's Microgravity Science and Applications Program is described. The results of the following three tasks are given in detail: (1) surface tensions and their variations with temperature and impurities; (2) convection during unidirectional solidification; and (3) measurement of high temperature thermophysical properties. Tasks 1 and 2 were directed toward determining how the reduced gravity obtained in space flight can affect convection and solidification processes. Emphasis in task 3 was on development of levitation and containerless processing techniques which can be applied in space flight to provide thermodynamic measurements of reactive materials.
Advanced Microscopic Integrated Thermocouple Arrays
NASA Technical Reports Server (NTRS)
Pettigrew, Penny J.
1999-01-01
The purpose of this research is to develop and refine a technique for making microscopic thermocouple arrays for use in measuring the temperature gradient across a solid-liquid interface during the solidification process. Current thermocouple technology does not allow for real-time measurements across the interface due to the prohibitive size of available thermocouples. Microscopic thermocouple arrays will offer a much greater accuracy and resolution of temperature measurements across the solid-liquid interface which will lead to a better characterization of the solidification process and interface reaction which affect the properties of the resulting material.
Conceptualising the effectiveness of impact assessment processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chanchitpricha, Chaunjit, E-mail: chaunjit@g.sut.ac.th; Bond, Alan, E-mail: alan.bond@uea.ac.uk; Unit for Environmental Sciences and Management School of Geo and Spatial Sciences, Internal Box 375, North West University
2013-11-15
This paper aims at conceptualising the effectiveness of impact assessment processes through the development of a literature-based framework of criteria to measure impact assessment effectiveness. Four categories of effectiveness were established: procedural, substantive, transactive and normative, each containing a number of criteria; no studies have previously brought together all four of these categories into such a comprehensive, criteria-based framework and undertaken systematic evaluation of practice. The criteria can be mapped within a cycle or cycles of evaluation, based on the 'logic model', at the stages of input, process, output and outcome to enable the identification of connections between the criteria across the categories of effectiveness. This framework is considered to have potential application in measuring the effectiveness of many impact assessment processes, including strategic environmental assessment (SEA), environmental impact assessment (EIA), social impact assessment (SIA) and health impact assessment (HIA). -- Highlights: • Conceptualising effectiveness of impact assessment processes. • Identification of factors influencing effectiveness of impact assessment processes. • Development of criteria within a framework for evaluating IA effectiveness. • Applying the logic model to examine connections between effectiveness criteria.
Trivellin, Nicola; Barbisan, Diego; Badocco, Denis; Pastore, Paolo; Meneghesso, Gaudenzio; Meneghini, Matteo; Zanoni, Enrico; Belgioioso, Giuseppe; Cenedese, Angelo
2018-04-07
The importance of oxygen in the winemaking process is widely known, as it affects the chemical aspects and therefore the organoleptic characteristics of the final product. Hence, the usefulness of continuous, real-time measurement of oxygen levels in the various stages of the winemaking process, both for monitoring and for control, is evident. The WOW project (Deployment of WSAN technology for monitoring Oxygen in Wine products) has focused on the design and development of an innovative device for monitoring oxygen levels in wine. The system is based on the use of an optical fiber to measure the luminescence lifetime variation of a reference metal/porphyrin complex, which decays in the presence of oxygen. The developed technology results in a high-sensitivity, low-cost sensor head that can be employed for measuring dissolved oxygen levels at several points inside a wine fermentation or aging tank. The system can be complemented with dynamic modeling techniques to provide predictive behavior of the nutrient evolution in space and time given a few sampled measuring points, for both process monitoring and control purposes. The experimental validation of the technology was first performed in a controlled laboratory setup to attain calibration and study sensitivity with respect to different photo-luminescent compounds and alcoholic or non-alcoholic solutions, and then in an actual case study during a measurement campaign at a renowned Italian winery.
Trivellin, Nicola; Barbisan, Diego; Badocco, Denis; Pastore, Paolo; Meneghini, Matteo; Zanoni, Enrico; Belgioioso, Giuseppe
2018-01-01
The importance of oxygen in the winemaking process is widely known, as it affects the chemical aspects and therefore the organoleptic characteristics of the final product. Hence, the usefulness of continuous, real-time measurement of oxygen levels in the various stages of the winemaking process, both for monitoring and for control, is evident. The WOW project (Deployment of WSAN technology for monitoring Oxygen in Wine products) has focused on the design and development of an innovative device for monitoring oxygen levels in wine. The system is based on the use of an optical fiber to measure the luminescence lifetime variation of a reference metal/porphyrin complex, which decays in the presence of oxygen. The developed technology results in a high-sensitivity, low-cost sensor head that can be employed for measuring dissolved oxygen levels at several points inside a wine fermentation or aging tank. The system can be complemented with dynamic modeling techniques to provide predictive behavior of the nutrient evolution in space and time given a few sampled measuring points, for both process monitoring and control purposes. The experimental validation of the technology was first performed in a controlled laboratory setup to attain calibration and study sensitivity with respect to different photo-luminescent compounds and alcoholic or non-alcoholic solutions, and then in an actual case study during a measurement campaign at a renowned Italian winery. PMID:29642468
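Luminescence-quenching oximetry of the kind the WOW sensor uses is commonly modelled by the Stern-Volmer relation tau0/tau = 1 + Ksv·[O2]. The sketch below inverts that relation; the lifetime and calibration constants are assumed values for illustration, not the project's calibration.

```python
# Stern-Volmer inversion: a shorter measured luminescence lifetime means
# stronger quenching by oxygen, hence a higher dissolved-oxygen level.
def oxygen_from_lifetime(tau, tau0=60e-6, ksv=0.5):
    """Dissolved O2 (mg/L) from measured lifetime tau (s), given the
    unquenched lifetime tau0 (s) and Stern-Volmer constant ksv (L/mg)."""
    return (tau0 / tau - 1.0) / ksv

low_o2 = oxygen_from_lifetime(55e-6)   # lifetime close to tau0: little O2
high_o2 = oxygen_from_lifetime(30e-6)  # strongly quenched: more O2
print(high_o2 > low_o2)  # prints True
```

Measuring lifetime rather than raw intensity makes the reading robust to fiber losses and source drift, which is one reason lifetime-based optical probes suit long campaigns in tanks.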
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thorne, N; Kassaee, A
Purpose: To develop an algorithm which can calculate the Full Width Half Maximum (FWHM) of a proton pencil beam from a 2D ion chamber array (IBA Matrixx) with limited spatial resolution (7.6 mm inter-chamber distance). The algorithm would allow beam FWHM measurements to be taken during daily QA without an appreciable time increase. Methods: Combinations of 147 MeV single spot beams were delivered onto an IBA Matrixx and concurrently onto EBT3 films as a standard. Data were collected around the Bragg peak region and evaluated by a custom MATLAB script based on our algorithm using a least squares analysis. A set of artificial data, modified with random noise, was also processed to test for robustness. Results: The MATLAB-processed Matrixx data show acceptable agreement (within 5%) with film measurements, with no single measurement differing by more than 1.8 mm. In cases where the spots show some degree of asymmetry, the algorithm is able to resolve the differences. The algorithm was able to process artificial data with noise up to 15% of the maximum value. Each measurement took less than 3 minutes to perform, indicating that such measurements may be efficiently added to daily QA. Conclusion: The developed algorithm can be implemented in a daily QA program for proton pencil beam scanning (PBS) beams with the Matrixx to extract spot size and position information. The developed algorithm may be extended to small field sizes in the photon clinic.
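One simple way to recover a sub-pitch FWHM from coarsely sampled data, in the spirit of the least-squares approach above (though not the authors' MATLAB script), is a closed-form log-parabola fit to three samples of a Gaussian-like spot. The beam parameters below are synthetic.

```python
import math

def fwhm_three_point(x, y):
    """x: three equally spaced positions; y: profile values, peak in middle.
    Fits a parabola to ln(y) in closed form and returns the Gaussian FWHM."""
    h = x[1] - x[0]                          # sample pitch (e.g. 7.6 mm)
    l0, l1, l2 = (math.log(v) for v in y)
    curvature = (l0 - 2.0 * l1 + l2) / h**2  # second derivative of ln(y)
    sigma = math.sqrt(-1.0 / curvature)      # ln(y) = const - x^2/(2*sigma^2)
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma

# Synthetic Gaussian spot, sigma = 5 mm, sampled at the 7.6 mm chamber pitch
sigma_true = 5.0
xs = [-7.6, 0.0, 7.6]
ys = [math.exp(-xi**2 / (2 * sigma_true**2)) for xi in xs]
print(round(fwhm_three_point(xs, ys), 3))  # prints 11.774 (= 2.355 * sigma)
```

For noisy or asymmetric spots, a full least-squares fit over all chambers (as the abstract describes) is more robust than this three-point closed form, but the principle of recovering sub-pitch width is the same.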
2008-01-01
Self-initiated learning where you define the objective, pace, and process. The handbook's sections include: How to Use This Handbook; Your Strengths & Weaknesses; Learning to Learn; Move Forward & Measure Progress; Where Should I Go?; and The Self-Development Process. It is addressed to readers preparing for their current career track or for a different career track altogether, who may lack skills or knowledge, or who have something they have always wanted to learn.
A thermodynamic analysis of the environmental indicators of natural gas combustion processes
NASA Astrophysics Data System (ADS)
Elsukov, V. K.
2010-07-01
Environmental indicators of the natural gas combustion process are studied using the model of extreme intermediate states developed at the Melent’ev Institute of Power Engineering Systems. Technological factors responsible for generation of polycyclic aromatic hydrocarbons and hydrogen cyanide are revealed. Measures for reducing the amounts of polycyclic aromatic hydrocarbons, hydrogen cyanide, nitrogen oxide, and other pollutants emitted from boilers are developed.
2012-01-01
If public health agencies are to fulfill their overall mission, they need to have defined measurable targets and should structure services to reach these targets, rather than offer a combination of ill-targeted programs. In order to do this, it is essential that there be a clear definition of what public health should do: a definition that does not ebb and flow based upon the prevailing political winds, but rather is based upon professional standards and measurements. The establishment of the Essential Public Health Services framework in the U.S.A. was a major move in that direction, and the model, or revisions of the model, have been adopted beyond the borders of the U.S. This article reviews the U.S. public health system, the needs and processes which brought about the development of the 10 Essential Public Health Services (EPHS), and historical and contemporary applications of the model. It highlights the value of establishing a common delineation of public health activities such as those contained in the EPHS, and explores the validity of using the same process in other countries through a discussion of the development in Israel of a similar model, the 10 Public Health Essential Functions (PHEF), that describes the activities of Israel's public health system. The use of the same process and framework to develop similar yet distinct frameworks suggests that the process has wide applicability, and may be beneficial to any public health system. Once a model is developed, it can be used to measure public health performance and improve the quality of services delivered through the development of standards and measures based upon the model, which could, ultimately, improve the health of the communities that depend upon public health agencies to protect their well-being. PMID:23181452
Examining Equivalence of Concepts and Measures in Diverse Samples
Choi, Yoonsun; Abbott, Robert D.; Catalano, Richard F.; Bliesner, Siri L.
2012-01-01
While there is growing awareness for the need to examine the etiology of problem behaviors across cultural, racial, socioeconomic, and gender groups, much research tends to assume that constructs are equivalent and that the measures developed within one group equally assess constructs across groups. The meaning of constructs, however, may differ across groups or, if similar in meaning, measures developed for a given construct in one particular group may not be assessing the same construct or may not be assessing the construct in the same manner in other groups. The aims of this paper were to demonstrate a process of testing several forms of equivalence including conceptual, functional, item, and scalar using different methods. Data were from the Cross-Cultural Families Project, a study examining factors that promote the healthy development and adjustment of children among immigrant Cambodian and Vietnamese families. The process described in this paper can be implemented in other prevention studies interested in diverse groups. Demonstrating equivalence of constructs and measures prior to group comparisons is necessary in order to lend support of our interpretation of issues such as ethnic group differences and similarities. PMID:16845592
Design and Development of a Three-Component Force Sensor for Milling Process Monitoring
Li, Yingxue; Zhao, Yulong; Fei, Jiyou; Qin, Yafei; Zhao, You; Cai, Anjiang; Gao, Song
2017-01-01
A strain-type three-component table dynamometer is presented in this paper, which reduces output errors produced by cutting forces imposed on different milling positions of a workpiece. A sensor structure with eight parallel elastic beams is proposed, and sensitive regions and Wheatstone measuring circuits are designed to eliminate the influence of eccentric forces. To evaluate the sensor's decoupling performance, both static calibration and dynamic milling tests were carried out at different positions on the workpiece. Static experiment results indicate that the maximal deviation between the measured forces and the standard inputs is 4.58%. Milling tests demonstrate that, with the same machining parameters, the differences in the measured forces between different milling positions derived by the developed sensor are no larger than 6.29%. In addition, the natural frequencies of the dynamometer remain higher than 2585.5 Hz. All the measuring results show that, as a strain-type dynamometer, the developed force sensor has improved eccentric decoupling accuracy without much reduction in natural frequency, giving it application potential in milling process monitoring. PMID:28441354
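The paper's decoupling scheme is not reproduced in the abstract; one common approach to static calibration of a multi-component force sensor, sketched here under invented numbers, is to fit a decoupling matrix C by least squares from known applied forces and the corresponding bridge outputs (F ≈ C·V), then apply C to later readings to strip cross-talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensitivity matrix: each force component couples
# weakly into the other bridge channels (cross-talk to be removed).
S = np.array([[1.00, 0.04, 0.02],
              [0.03, 0.95, 0.05],
              [0.01, 0.02, 1.10]])

F_cal = rng.uniform(-100, 100, size=(3, 50))   # known applied forces (N)
V_cal = S @ F_cal                               # measured bridge outputs

# Fit decoupling matrix C so that F ~= C @ V (least squares).
C = F_cal @ np.linalg.pinv(V_cal)

V_new = S @ np.array([[50.0], [-20.0], [10.0]])
print(np.round(C @ V_new, 2).ravel())           # recovers 50, -20, 10 N
```

With noiseless synthetic data C converges to the inverse of the sensitivity matrix; a real calibration would average repeated loadings and report residual cross-talk, as the paper's 4.58% figure does.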
Experimental evaluation of tool run-out in micro milling
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta
2018-05-01
This paper deals with the micro milling cutting process, focusing attention on tool run-out measurement. In fact, among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting-edge phase measurement is based on force signal analysis. The developed procedure has been tested on data coming from micro milling experimental tests performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.
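The authors' analytical model is not given in the abstract; a hedged geometric sketch of one common simplification illustrates why channel width enters the estimate: with static run-out, the flute farthest from the spindle axis sweeps a radius larger than the nominal one, widening the channel by roughly twice the run-out offset (an illustrative model, not the paper's full formulation):

```python
def runout_offset_mm(channel_width_mm, tool_diameter_mm):
    """Estimate static run-out offset r0 from an over-wide channel.

    Illustrative model: width ~= diameter + 2*r0, since the flute
    farthest from the spindle axis sets the channel width.
    """
    return max(0.0, (channel_width_mm - tool_diameter_mm) / 2.0)

# Hypothetical 0.5 mm end mill cutting a 0.52 mm wide channel
print(round(runout_offset_mm(0.52, 0.50), 4))   # -> 0.01 (10 microns)
```

The paper additionally uses the phase angle between the cutting edges, extracted from the force signal, to resolve the run-out direction, which this width-only sketch cannot do.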
Advances in Projection Moire Interferometry Development for Large Wind Tunnel Applications
NASA Technical Reports Server (NTRS)
Fleming, Gary A.; Soto, Hector L.; South, Bruce W.; Bartram, Scott M.
1999-01-01
An instrument development program aimed at using Projection Moire Interferometry (PMI) for acquiring model deformation measurements in large wind tunnels was begun at NASA Langley Research Center in 1996. Various improvements to the initial prototype PMI systems have been made throughout this development effort. This paper documents several of the most significant improvements to the optical hardware and image processing software, and addresses system implementation issues for large wind tunnel applications. The improvements have increased both measurement accuracy and instrument efficiency, promoting the routine use of PMI for model deformation measurements in production wind tunnel tests.
Integrated Research Methods for Applied Urban Hydrogeology of Karst Sites
NASA Astrophysics Data System (ADS)
Epting, J.; Romanov, D. K.; Kaufmann, G.; Huggenberger, P.
2008-12-01
Integrated and adaptive surface- and groundwater monitoring and management in urban areas require innovative process-oriented approaches. To accomplish this, it is necessary to develop and combine interdisciplinary instruments that facilitate adequately quantifying cumulative effects on groundwater flow regimes. While the characterization and modeling of flow in heterogeneous and fractured media have been investigated intensively, there are no well-developed long-term hydrogeological research sites for gypsum karst. Considering that infrastructures in karst regions, particularly in gypsum, are prone to subsidence, severe problems can arise in urban areas. In the 1880s, a river dam was constructed on gypsum-containing rock southeast of Basel, Switzerland. Over the last 30 years, subsidence of the dam and an adjacent highway has been observed. Surface water infiltrates upstream of the dam, circulates in the gravel deposits and in the weathered bedrock around and beneath the dam, and exfiltrates downstream into the river. These processes enhance karstification in the soluble units of the gypsum. As a result, an extended weathering zone within the bedrock and the development of preferential flow paths within voids and conduits can be observed. To prevent further subsidence, construction measures were conducted in two major project phases in 2006 and 2007. The highway was supported by a large number of pillars embedded in the non-weathered rock and by a sealing pile wall to prevent infiltrating river water from circulating around the dam and beneath the foundation of the highway. To safeguard surface and subsurface water resources during the construction measures, an extensive observation network was set up.
Protection schemes and geotechnical investigations that are necessary for engineering projects often provide "windows of opportunity", bearing the possibility to change perceptions concerning the sustainable development of water resources and coordinate future measures. Theories describing the evolution of karst systems are mainly based on conceptual models. Although these models are based on fundamental and well established physical and chemical principles that allow studying important processes from initial small scale fracture networks to the mature karst, systems for monitoring the evolution of karst phenomena are rare. Integrated process-oriented investigation methods are presented, comprising the combination of multiple data sources (lithostratigraphic information of boreholes, extensive groundwater monitoring, dye tracer tests, geophysics) with high-resolution numerical groundwater modeling and model simulations of karstification below the dam. Subsequently, different scenarios evaluated the future development of the groundwater flow regime, the karstification processes as well as possible remediation measures. The approach presented assists in optimizing investigation methods, including measurement and monitoring technologies with predictive character for similar subsidence problems within karst environments in urban areas.
Adair, Carol E; Simpson, Elizabeth; Casebeer, Ann L; Birdsell, Judith M; Hayden, Katharine A; Lewis, Steven
2006-07-01
This paper summarizes findings of a comprehensive, systematic review of the peer-reviewed and grey literature on performance measurement according to each stage of the performance measurement process--conceptualization, selection and development, data collection, and reporting and use. It also outlines implications for practice. Six hundred sixty-four articles about organizational performance measurement from the health and business literature were reviewed after systematic searches of the literature, multi-rater relevancy ratings, citation checks and expert author nominations. Key themes were extracted and summarized from the most highly rated papers for each performance measurement stage. Despite a virtually universal consensus on the potential benefits of performance measurement, little evidence currently exists to guide practice in healthcare. Issues in conceptualizing systems include strategic alignment and scope. There are debates on the criteria for selecting measures and on the types and quality of measures. Implementation of data collection and analysis systems is complex and costly, and challenges persist in reporting results, preventing unintended effects and putting findings for improvement into action. There is a need for further development and refinement of performance measures and measurement systems, with a particular focus on strategies to ensure that performance measurement leads to healthcare improvement.
Automated aerial image based CD metrology initiated by pattern marking with photomask layout data
NASA Astrophysics Data System (ADS)
Davis, Grant; Choi, Sun Young; Jung, Eui Hee; Seyfarth, Arne; van Doornmalen, Hans; Poortinga, Eric
2007-05-01
The photomask is a critical element in the lithographic image transfer process from the drawn layout to the final structures on the wafer. The non-linearity of the imaging process and the related MEEF impose a tight control requirement on the photomask critical dimensions. Critical dimensions can be measured in aerial images with hardware emulation. This is a more recent complement to the standard scanning electron microscope measurement of wafers and photomasks. Aerial image measurement includes non-linear, 3-dimensional, and materials effects on imaging that cannot be observed directly by SEM measurement of the mask. Aerial image measurement excludes the processing effects of printing and etching on the wafer. This presents a unique contribution to the difficult process control and modeling tasks in mask making. In the past, aerial image measurements have been used mainly to characterize the printability of mask repair sites. Development of photomask CD characterization with the AIMS™ tool was motivated by the benefit of MEEF sensitivity and the shorter feedback loop compared to wafer exposures. This paper describes a new application that includes: an improved interface for the selection of meaningful locations using the photomask and design layout data with the Calibre™ Metrology Interface, an automated recipe generation process, an automated measurement process, and automated analysis and result reporting on a Carl Zeiss AIMS™ system.
The Development of Quality Measures for the Performance and Interpretation of Esophageal Manometry
Yadlapati, Rena; Gawron, Andrew J.; Keswani, Rajesh N.; Bilimoria, Karl; Castell, Donald O.; Dunbar, Kerry B.; Gyawali, Chandra P.; Jobe, Blair A.; Katz, Philip O.; Katzka, David A.; Lacy, Brian E.; Massey, Benson T.; Richter, Joel E.; Schnoll-Sussman, Felice; Spechler, Stuart J.; Tatum, Roger; Vela, Marcelo F.; Pandolfino, John E.
2016-01-01
Background and Aims Esophageal manometry (EM) is the gold standard for the diagnosis of esophageal motility disorders. Variations in the performance and interpretation of EM result in discrepant diagnoses and unnecessary repeated procedures, and may negatively impact patient outcomes. A method to benchmark the procedural quality of EM is needed. The primary aim of this study was to develop quality measures for performing and interpreting EM. Methods The RAND/University of California, Los Angeles Appropriateness Methodology (RAM) was utilized. Fifteen experts in esophageal manometry were invited to be a part of the panel. Potential quality measures were identified through a literature search and interviews with experts. The expert panel ranked the proposed quality measures for appropriateness via a two-round process on the basis of RAM. Results Fourteen experts participated in all processes. A total of 29 measures were considered; 17 of these measures were ranked as appropriate and related to competency (2), pre-procedure (2), procedure (3) and interpretation (10). The latter 10 were integrated into a single composite measure. Thus, 8 final measures were determined to be appropriate quality measures for EM. Five strong recommendations were also endorsed by the experts, however they were not ranked as appropriate quality measures. Conclusions Eight formally validated quality measures for the performance and interpretation of EM were developed on the basis of RAM. These measures represent key aspects of a high-quality EM study and should be uniformly adopted. Evaluation of these measures in clinical practice is needed to assess their impact on outcomes. PMID:26499925
NASA Astrophysics Data System (ADS)
Masi, Matteo; Ferdos, Farzad; Losito, Gabriella; Solari, Luca
2016-04-01
Electrical Impedance Tomography (EIT) is a technique for imaging the electrical properties of conductive materials. In EIT, the spatial distribution of the electrical resistivity or electrical conductivity within a domain is reconstructed using measurements made with electrodes placed at the boundaries of the domain. Data acquisition is typically performed by applying an electrical current to the object under investigation using a set of electrodes and measuring the developed voltage between the other electrodes. The tomographic image is then obtained using an inversion algorithm. This work describes the implementation of a simple and low-cost 3D EIT measurement system suitable for laboratory-scale studies. The system was specifically developed for time-lapse imaging of soil samples subjected to erosion processes during laboratory tests. The tests reproduce the process of internal erosion of soil particles by water flow within a granular medium; this process is one of the most common causes of failure of earthen levees and embankment dams. The measurements had strict requirements of speed and accuracy due to the varying time scale and magnitude of these processes. The developed EIT system consists of a PC which controls I/O cards (multiplexers) through an Arduino micro-controller, an external current generator, a digital acquisition device (DAQ), a power supply, and the electrodes. The ease of programming of the Arduino interface greatly helped the implementation of custom acquisition software, increasing the overall flexibility of the system and enabling specific acquisition schemes and configurations. The system works with a multi-electrode configuration of up to 48 channels, but it was designed to be upgraded to an arbitrarily large number of electrodes by connecting additional multiplexer cards (> 96 electrodes).
The acquisition was optimized for multi-channel measurements so that the overall time of acquisition is dramatically reduced compared to the single channel instrumentation. The accuracy and operation were tested under different conditions. The results from preliminary tests show that the system is able to clearly identify objects discriminated by different resistivity. Furthermore, measurements carried out during internal erosion simulations demonstrate that even small variations in the electrical resistivity can be captured and these changes can be related to the erosion processes.
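The authors' acquisition schemes are configurable and not detailed in the abstract; as an illustrative sketch (not their exact protocol), the classic adjacent drive pattern injects current between neighbouring electrodes and measures voltage across every other neighbouring pair that does not share a drive electrode, yielding n·(n-3) frames for n electrodes:

```python
def adjacent_pattern(n_electrodes):
    """Drive/measure electrode pairs for the adjacent EIT protocol."""
    frames = []
    for d in range(n_electrodes):
        drive = (d, (d + 1) % n_electrodes)
        for m in range(n_electrodes):
            meas = (m, (m + 1) % n_electrodes)
            if set(drive) & set(meas):
                continue   # skip pairs touching a drive electrode
            frames.append((drive, meas))
    return frames

# 16-electrode ring: 16 drive pairs x 13 usable measurement pairs
print(len(adjacent_pattern(16)))   # -> 208
```

In hardware, each frame corresponds to one multiplexer switch setting; iterating this list through the Arduino-controlled I/O cards reproduces the kind of automated multi-channel sweep described above.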
Development of a thermal acoustical aircraft insulation material
NASA Technical Reports Server (NTRS)
Lin, R. Y.; Struzik, E. A.
1974-01-01
A process was developed for fabricating a lightweight foam suitable for thermal and acoustical insulation in aircraft. The procedures and apparatus are discussed, and the foam specimens are characterized by numerous tests and measurements.
Off-Body Boundary-Layer Measurement Techniques Development for Supersonic Low-Disturbance Flows
NASA Technical Reports Server (NTRS)
Owens, Lewis R.; Kegerise, Michael A.; Wilkinson, Stephen P.
2011-01-01
Investigations were performed to develop accurate boundary-layer measurement techniques in a Mach 3.5 laminar boundary layer on a 7° half-angle cone at 0° angle of attack. A discussion of the measurement challenges is presented as well as how each was addressed. A computational study was performed to minimize the probe aerodynamic interference effects, resulting in improved pitot and hot-wire probe designs. Probe calibration and positioning processes were also developed with the goal of reducing the measurement uncertainties from 10% levels to less than 5% levels. Efforts were made to define the experimental boundary conditions for the cone flow so comparisons could be made with a set of companion computational simulations. The development status of the mean and dynamic boundary-layer flow measurements for a nominally sharp cone in a low-disturbance supersonic flow is presented.
Measuring Down: Evaluating Digital Storytelling as a Process for Narrative Health Promotion.
Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah; DiFulvio, Gloria; Del Toro-Mejías, Lizbeth
2016-05-15
Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. We present key evaluation findings of a 2-year, mixed-methods study that focused on the effects of participating in the DST process on young Puerto Rican Latinas' self-esteem, social support, empowerment, and sexual attitudes and behaviors. Quantitative results did not show significant changes in the expected outcomes. However, in our qualitative findings we identified several ways in which the DST process had positive, health-bearing effects. We argue for the importance of "measuring down" to reflect the locally grounded, felt experiences of participants who engage in the process, as current quantitative scales do not "measure up" to accurately capture these effects. We end by suggesting the need to develop mixed-methods, culturally relevant, and sensitive evaluation tools that prioritize process effects as they inform intervention and health promotion. © The Author(s) 2016.
Development of emission factors for polycarbonate processing.
Rhodes, Verne L; Kriek, George; Lazear, Nelson; Kasakevich, Jean; Martinko, Marie; Heggs, R P; Holdren, M W; Wisbith, A S; Keigley, G W; Williams, J D; Chuang, J C; Satola, J R
2002-07-01
Emission factors for selected volatile organic compounds (VOCs) and particulate emissions were developed while processing eight commercial grades of polycarbonate (PC) and one grade of a PC/acrylonitrile-butadiene-styrene (ABS) blend. A small commercial-type extruder was used, and the extrusion temperature was held constant at 304 degrees C. An emission factor was calculated for each substance measured and is reported as pounds released to the atmosphere/million pounds of polymer resin processed [ppm (wt/wt)]. Scaled to production volumes, these emission factors can be used by processors to estimate emission quantities from similar PC processing operations.
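The emission factor definition above reduces to a one-line computation; a hedged sketch with made-up numbers (not data from the study):

```python
def emission_factor_ppm(mass_emitted_lb, mass_processed_lb):
    """Pounds emitted per million pounds of resin processed [ppm (wt/wt)]."""
    return mass_emitted_lb / mass_processed_lb * 1e6

# Hypothetical run: 0.3 lb of a VOC collected while extruding 12,000 lb of PC
print(round(emission_factor_ppm(0.3, 12_000), 2))   # -> 25.0 ppm
```

Scaling such a factor by a plant's annual throughput gives the estimated annual release for that substance, which is how processors would apply the published factors.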
From mission to measures: performance measure development for a Teen Pregnancy Prevention Program.
Farb, Amy Feldman; Burrus, Barri; Wallace, Ina F; Wilson, Ellen K; Peele, John E
2014-03-01
The Office of Adolescent Health (OAH) sought to create a comprehensive set of performance measures to capture the performance of the Teen Pregnancy Prevention (TPP) program. This performance measurement system needed to provide measures that could be used internally (by both OAH and the TPP grantees) for management and program improvement as well as externally to communicate the program's progress to other interested stakeholders and Congress. This article describes the selected measures and outlines the considerations behind the TPP measurement development process. Issues faced, challenges encountered, and lessons learned have broad applicability for other federal agencies and, specifically, for TPP programs interested in assessing their own performance and progress. Published by Elsevier Inc.
A Nursing Process Methodology.
ERIC Educational Resources Information Center
Ryan-Wenger, Nancy M.
1990-01-01
A nursing methodology developed by the faculty at The Ohio State University teaches nursing students problem-solving techniques applicable to any nursing situation. It also provides faculty and students with a basis for measuring students' progress and ability in applying the nursing process. (Author)
Methane’s Role in Promoting Sustainable Development in the Oil and Natural Gas Industry
The document summarizes a number of established methods to identify, measure and reduce methane emissions from a variety of equipment and processes in oil and gas production and natural gas processing and transmission facilities.
To develop a spectral analyzer for physiological and medical use
NASA Technical Reports Server (NTRS)
Iberall, A.; Cardon, S.; Weinberg, M.; Schindler, A.
1971-01-01
Scientific requirements necessary to develop a spectral analyzer for monitoring mammalian subjects are discussed. The analyzer measures dynamic or time-dependent data as a measure of the subject's operating status. Measurable data include metabolic rate, body temperature, and blood constituents such as glucose, oxygen, carbon dioxide, and lactic acid. Metabolic cycles were found with periodicities in the range of minutes and hours; longer cycles in body weight (3 1/2 days and 60 days), indicative of metabolic processes, were also found.
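The abstract reports cycles with minute-to-hour periods but not the analysis method; a hedged sketch of how a dominant period can be pulled from sampled physiological data with an FFT (illustrative only, not the analyzer's actual technique, and the trace below is synthetic):

```python
import numpy as np

def dominant_period(signal, dt_s):
    """Return the period (s) of the strongest nonzero-frequency component."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=dt_s)
    return 1.0 / freqs[np.argmax(spectrum[1:]) + 1]

# Synthetic metabolic-rate trace: a 6-minute cycle sampled every 10 s
t = np.arange(0.0, 3600.0, 10.0)
trace = 100.0 + 5.0 * np.sin(2.0 * np.pi * t / 360.0)
print(round(dominant_period(trace, 10.0) / 60.0, 1))   # -> 6.0 minutes
```

Resolving the much longer body-weight cycles (3 1/2 and 60 days) would need correspondingly long records, since the lowest resolvable frequency is the reciprocal of the record length.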
Measuring ground movement in geothermal areas of Imperial Valley, California
NASA Technical Reports Server (NTRS)
Lofgren, B. E.
1974-01-01
Significant ground movement may accompany the extraction of large quantities of fluids from the subsurface. In Imperial Valley, California, one of the potential hazards of geothermal development is the threat of both subsidence and horizontal movement of the land surface. Regional and local survey nets are being monitored to detect and measure possible ground movement caused by future geothermal developments. Precise measurement of surface and subsurface changes will be required to differentiate man-induced changes from natural processes in this tectonically active region.
Improved determination of vector lithospheric magnetic anomalies from MAGSAT data
NASA Technical Reports Server (NTRS)
Ravat, Dhananjay
1993-01-01
Scientific contributions made in developing new methods to isolate and map vector magnetic anomalies from measurements made by Magsat are described. In addition to the objective of the proposal, the isolation and mapping of equatorial vector lithospheric Magsat anomalies, the isolation of polar ionospheric fields during the period was also studied. Significant progress was also made in the isolation of polar delta(Z) component and scalar anomalies as well as the integration and synthesis of various techniques for removing equatorial and polar ionospheric effects. The significant contributions of this research are: (1) development of empirical/analytical techniques for modeling ionospheric fields in Magsat data and their removal from uncorrected anomalies to obtain better estimates of lithospheric anomalies (this task was accomplished for equatorial delta(X), delta(Z), and delta(B) component and polar delta(Z) and delta(B) component measurements); (2) integration of important processing techniques developed during the last decade with the newly developed technologies of ionospheric field modeling into an optimum processing scheme; and (3) implementation of the above processing scheme to map the most robust magnetic anomalies of the lithosphere (components as well as scalar).
NASA Technical Reports Server (NTRS)
Britt, C. L., Jr.
1975-01-01
The development of an RF Multilateration system to provide accurate position and velocity measurements during the approach and landing phase of Vertical Takeoff Aircraft operation is discussed. The system uses an angle-modulated ranging signal to provide both range and range rate measurements between an aircraft transponder and multiple ground stations. Range and range rate measurements are converted to coordinate measurements and the coordinate and coordinate rate information is transmitted by an integral data link to the aircraft. Data processing techniques are analyzed to show advantages and disadvantages. Error analyses are provided to permit a comparison of the various techniques.
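The abstract compares data processing techniques without reproducing them; a hedged sketch of the core conversion step — a least-squares position fix from transponder ranges to multiple ground stations — might look like the following (station layout and numbers are invented for illustration):

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical ground-station coordinates (x, y, z in metres)
stations = np.array([[0.0, 0.0, 0.0],
                     [1000.0, 0.0, 0.0],
                     [0.0, 1000.0, 0.0],
                     [500.0, 500.0, 30.0]])

def solve_position(ranges_m, guess=(0.0, 0.0, 100.0)):
    """Least-squares aircraft fix from ranges to n >= 4 stations."""
    residual = lambda p: np.linalg.norm(stations - p, axis=1) - ranges_m
    return least_squares(residual, guess).x

true_pos = np.array([300.0, 400.0, 150.0])
ranges = np.linalg.norm(stations - true_pos, axis=1)
print(np.round(solve_position(ranges), 1))   # recovers [300. 400. 150.]
```

Range-rate measurements would enter the same way, as additional residuals constraining the velocity components; the paper's error analyses weigh such formulations against direct coordinate conversion.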
Makanza, R; Zaman-Allah, M; Cairns, J E; Eyre, J; Burgueño, J; Pacheco, Ángela; Diepenbrock, C; Magorokosho, C; Tarekegne, A; Olsen, M; Prasanna, B M
2018-01-01
Grain yield, ear, and kernel attributes can assist in understanding the performance of the maize plant under different environmental conditions and can be used in the variety development process to address farmers' preferences. These parameters are, however, still laborious and expensive to measure. A low-cost ear digital imaging method was developed that provides estimates of ear and kernel attributes, i.e., ear number and size, kernel number and size, as well as kernel weight, from photos of ears harvested from field trial plots. The image processing method uses a script that runs in batch mode on ImageJ, an open-source software package. Kernel weight was estimated using the total kernel number derived from the number of kernels visible on the image and the average kernel size. Data showed good agreement in terms of accuracy and precision between ground-truth measurements and data generated through image processing. Broad-sense heritability of the estimated parameters was in the range of, or higher than, that for measured grain weight. Limitations of the method for kernel weight estimation are discussed. The method developed in this work provides an opportunity to significantly reduce the cost of selection in the breeding process, especially for resource-constrained crop improvement programs, and can be used to learn more about the genetic bases of grain yield determinants.
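The ImageJ script itself is not reproduced; a hedged Python analogue of the core step — thresholding an ear photo, counting kernel-sized connected components, and taking their mean size — could look like this (the toy "image" and the size threshold are invented for the sketch):

```python
import numpy as np
from scipy import ndimage

def count_kernels(gray, threshold=0.5, min_px=4):
    """Count bright kernel-sized blobs; return (count, mean blob size)."""
    binary = gray > threshold
    labels, n = ndimage.label(binary)
    sizes = ndimage.sum(binary, labels, range(1, n + 1))
    keep = sizes >= min_px               # drop specks below kernel size
    return int(np.count_nonzero(keep)), float(sizes[keep].mean())

# Toy "image": three 3x3 kernels plus one single-pixel speck
img = np.zeros((12, 12))
for r, c in [(1, 1), (1, 7), (7, 4)]:
    img[r:r + 3, c:c + 3] = 1.0
img[10, 10] = 1.0
n, mean_px = count_kernels(img)
print(n, mean_px)   # -> 3 9.0
```

In the paper's scheme, the visible-kernel count and mean kernel size would then be scaled to a total kernel number and multiplied out to estimate kernel weight.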
Developing an evaluation framework for clinical redesign programs: lessons learnt.
Samaranayake, Premaratne; Dadich, Ann; Fitzgerald, Anneke; Zeitz, Kathryn
2016-09-19
Purpose The purpose of this paper is to present lessons learnt through the development of an evaluation framework for a clinical redesign programme - the aim of which was to improve the patient journey through improved discharge practices within an Australian public hospital. Design/methodology/approach The development of the evaluation framework involved three stages - namely, the analysis of secondary data relating to the discharge planning pathway; the analysis of primary data including field-notes and interview transcripts on hospital processes; and the triangulation of these data sets to devise the framework. The evaluation framework ensured that resource use, process management, patient satisfaction, and staff well-being and productivity were each connected with measures, targets, and the aim of clinical redesign programme. Findings The application of business process management and a balanced scorecard enabled a different way of framing the evaluation, ensuring measurable outcomes were connected to inputs and outputs. Lessons learnt include: first, the importance of mixed-methods research to devise the framework and evaluate the redesigned processes; second, the need for appropriate tools and resources to adequately capture change across the different domains of the redesign programme; and third, the value of developing and applying an evaluative framework progressively. Research limitations/implications The evaluation framework is limited by its retrospective application to a clinical process redesign programme. Originality/value This research supports benchmarking with national and international practices in relation to best practice healthcare redesign processes. Additionally, it provides a theoretical contribution on evaluating health services improvement and redesign initiatives.
Software Engineering Laboratory Ada performance study: Results and implications
NASA Technical Reports Server (NTRS)
Booth, Eric W.; Stark, Michael E.
1992-01-01
The SEL is an organization sponsored by NASA/GSFC to investigate the effectiveness of software engineering technologies applied to the development of applications software. The SEL was created in 1977 and has three organizational members: NASA/GSFC, Systems Development Branch; The University of Maryland, Computer Sciences Department; and Computer Sciences Corporation, Systems Development Operation. The goals of the SEL are as follows: (1) to understand the software development process in the GSFC environments; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that include the Ada Performance Study Report. This paper describes the background of Ada in the Flight Dynamics Division (FDD), the objectives and scope of the Ada Performance Study, the measurement approach used, the performance tests performed, the major test results, and the implications for future FDD Ada development efforts.
Meyer, Markus; Donsa, Klaus; Truskaller, Thomas; Frohner, Matthias; Pohn, Birgit; Felfernig, Alexander; Sinner, Frank; Pieber, Thomas
2018-01-01
A fast and accurate data transmission from glucose meter to clinical decision support systems (CDSSs) is crucial for the management of type 2 diabetes mellitus, since almost all therapeutic interventions are derived from glucose measurements. The aim was to develop a prototype of an automated glucose measurement transmission protocol based on the Continua Design Guidelines and to embed the protocol into a CDSS used by healthcare professionals. Literature and market research was performed to analyze the state of the art and, thereupon, to develop, integrate, and validate an automated glucose measurement transmission protocol in an iterative process. The findings from the literature and market research guided the development of a standardized glucose measurement transmission protocol using a middleware. The interface description for communicating with the glucose meter was illustrated and embedded into a CDSS. A prototype of an interoperable transmission of glucose measurements was developed and implemented in a CDSS, presenting a promising way to reduce medication errors and improve user satisfaction.
Historical data recording for process computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, J.C.; Sellars, H.L.
1981-11-01
Computers have been used to monitor and control chemical and refining processes for more than 15 years. During this time, there has been a steady growth in the variety and sophistication of the functions performed by these process computers. Early systems were limited to maintaining only current operating measurements, available through crude operator's consoles or noisy teletypes. The value of retaining a process history, that is, a collection of measurements over time, became apparent, and early efforts produced shift and daily summary reports. The need for improved process historians which record, retrieve, and display process information has grown as process computers assume larger responsibilities in plant operations. This paper describes newly developed process historian functions that have been used on several of Du Pont's in-house process monitoring and control systems.
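The historian functions described above (record, retrieve, display) can be illustrated with a minimal sketch. This is a hypothetical design for exposition only, not Du Pont's implementation: measurements are stored per tag as timestamped samples in a bounded buffer, and retrieval filters by time window.

```python
import time
from collections import deque


class ProcessHistorian:
    """Toy process historian: record timestamped measurements per tag
    and retrieve them over a time window. Tag names and capacity are
    illustrative assumptions."""

    def __init__(self, capacity=10000):
        self._data = {}            # tag -> deque of (timestamp, value)
        self._capacity = capacity  # oldest samples are discarded first

    def record(self, tag, value, ts=None):
        ts = time.time() if ts is None else ts
        self._data.setdefault(
            tag, deque(maxlen=self._capacity)).append((ts, value))

    def retrieve(self, tag, t_start, t_end):
        """Return all (timestamp, value) samples for `tag` in [t_start, t_end]."""
        return [(t, v) for t, v in self._data.get(tag, ())
                if t_start <= t <= t_end]
```

A shift or daily summary report, as mentioned in the abstract, would simply aggregate the samples returned by `retrieve` over the shift's time window.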
Overall design of imaging spectrometer on-board light aircraft
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhongqi, H.; Zhengkui, C.; Changhua, C.
1996-11-01
Aerial remote sensing is the earliest remote sensing technical system and has developed rapidly in recent years. Its development was dominated by high-to-medium-altitude platforms in the past; today it is characterized by a diversity of platforms, including aircraft of high, medium, and low flying altitude, helicopters, airships, remotely controlled airplanes, gliders, and balloons. The most widely used and rapidly developed platform in recent years is the light aircraft. In the late 1970s, the Beijing Research Institute of Uranium Geology began aerial photography and geophysical survey using light aircraft, and it put forward the overall design scheme of a light aircraft imaging spectral application system (LAISAS) in the 1990s. LAISAS comprises four subsystems: a measuring platform, a data acquisition subsystem, a ground testing subsystem, and a data processing subsystem. The principal instruments of LAISAS include a measuring platform controlled by an inertia gyroscope, an aerial spectrometer with high spectral resolution, an imaging spectrometer, a 3-channel scanner, a 128-channel imaging spectrometer, GPS, an illuminance meter, and devices for atmospheric parameter measurement, ground testing, and data correction and processing. LAISAS features integrity, spanning data acquisition through data processing to application; stability, which guarantees image quality and rests on the measuring platform, ground testing devices, and in-door data correction system; exemplariness, integrating GIS, GPS, and image processing technology; and practicality, giving LAISAS flexibility and a high ratio of performance to cost. It can therefore be used in fundamental remote sensing research and in large-scale mapping for resource exploration, environmental monitoring, calamity prediction, and military purposes.
Research on motor rotational speed measurement in regenerative braking system of electric vehicle
NASA Astrophysics Data System (ADS)
Pan, Chaofeng; Chen, Liao; Chen, Long; Jiang, Haobin; Li, Zhongxing; Wang, Shaohua
2016-01-01
Rotational speed signal acquisition and processing techniques are widely used in rotating machinery. In order to realize precise, real-time control of the motor drive and the regenerative braking process, rotational speed measurement techniques are needed in electric vehicles. Obtaining an accurate motor rotational speed signal contributes to steady control of the regenerative braking force and a higher energy recovery rate. This paper aims to develop a method that provides instantaneous speed information in the form of motor rotation. It first addresses the principles of motor rotational speed measurement in the regenerative braking systems of electric vehicles. The paper then presents the characteristics of ideal and actual Hall position sensor signals, revealing the relation between the motor rotational speed and the Hall position sensor signals. Finally, a Hall position sensor signal conditioning and processing circuit and a program for motor rotational speed measurement are developed based on measurement error analysis.
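The relation between Hall sensor signals and rotational speed that the abstract mentions can be sketched simply. The following is a minimal illustration under stated assumptions (the function name, pole-pair count, and one-edge-per-electrical-revolution convention are hypothetical, not the paper's method): the electrical period is estimated from successive rising-edge timestamps of one Hall sensor, and the mechanical speed is the electrical speed divided by the number of pole pairs.

```python
def rpm_from_hall_edges(edge_times_s, pole_pairs=4):
    """Estimate mechanical speed (rpm) from rising-edge timestamps of a
    single Hall position sensor, assuming one rising edge per electrical
    revolution. Averaging over all observed edges reduces the effect of
    timing jitter in the actual (non-ideal) sensor signal."""
    if len(edge_times_s) < 2:
        return 0.0  # not enough edges to estimate a period
    # Average electrical period over the observed edge train.
    period_s = (edge_times_s[-1] - edge_times_s[0]) / (len(edge_times_s) - 1)
    electrical_rev_per_s = 1.0 / period_s
    return 60.0 * electrical_rev_per_s / pole_pairs


# Edges every 10 ms -> 100 electrical rev/s; with 4 pole pairs,
# the mechanical speed is about 1500 rpm.
edges = [i * 0.01 for i in range(11)]
print(rpm_from_hall_edges(edges, pole_pairs=4))
```

In a real controller the same computation would run on captured timer values rather than floating-point timestamps, but the principle is identical.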
WAMS measurements pre-processing for detecting low-frequency oscillations in power systems
NASA Astrophysics Data System (ADS)
Kovalenko, P. Y.
2017-07-01
Processing the data received from measurement systems implies situations in which one or more registered values stand apart from the rest of the sample. These values are referred to as "outliers", and their presence in the data sample under consideration may significantly influence the processing results. In order to ensure the accuracy of low-frequency oscillation detection in power systems, an algorithm has been developed for outlier detection and elimination. The algorithm is based on the concept of an irregular component of the measurement signal; this component comprises the measurement errors and is assumed to be a Gaussian-distributed random process. Median filtering is employed to detect values lying outside the range of the normally distributed measurement error on the basis of a 3σ criterion. The algorithm has been validated on both simulated signals and WAMS data.
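The median-filter-plus-3σ idea described above can be sketched in a few lines. This is a minimal reading of the approach, not the paper's exact algorithm (window size and the replace-with-median policy are my assumptions): the irregular component is estimated as the residual between each sample and a running median, and samples whose residual exceeds 3σ of the residuals are treated as outliers.

```python
import statistics


def remove_outliers(signal, window=5, n_sigma=3.0):
    """Replace outliers with the local median. A sample is an outlier if
    its residual from the running median exceeds n_sigma times the
    standard deviation of all residuals (the 3-sigma criterion, assuming
    the irregular component is Gaussian-distributed)."""
    half = window // 2
    medians = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        medians.append(statistics.median(signal[lo:hi]))
    residuals = [x - m for x, m in zip(signal, medians)]
    sigma = statistics.pstdev(residuals)  # spread of the irregular component
    return [m if abs(r) > n_sigma * sigma else x
            for x, m, r in zip(signal, medians, residuals)]


# A flat signal with one spurious spike: the spike is pulled back
# to the local median, the rest of the samples are untouched.
data = [0.0] * 20
data[10] = 50.0
print(remove_outliers(data))
```

A production WAMS pre-processor would estimate σ robustly (e.g. from the median absolute deviation) so that the outliers themselves do not inflate the threshold; the sketch keeps the plain standard deviation for brevity.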
Swingler, Margaret M; Perry, Nicole B; Calkins, Susan D; Bell, Martha Ann
2017-01-01
We apply a biopsychosocial conceptualization to attention development in the 1st year and examine the role of neurophysiological and social processes on the development of early attention processes. We tested whether maternal behavior measured during 2 mother-child interaction tasks when infants (N = 388) were 5 months predicted infant medial frontal (F3/F4) EEG power and observed attention behavior during an attention task at 10 months. After controlling for infant attention behavior and EEG power in the same task measured at an earlier 5-month time point, results indicated a significant direct and positive association from 5-month maternal positive affect to infant attention behavior at 10 months. However, maternal positive affect was not related to medial frontal EEG power. In contrast, 5-month maternal intrusive behavior was associated with infants' task-related EEG power change at the left frontal location, F3, at 10 months of age. The test of indirect effects from 5-month maternal intrusiveness to 10-month infant attention behavior via infants' EEG power change at F3 was significant. These findings suggest that the development of neural networks serving attention processes may be 1 mechanism through which early maternal behavior is related to infant attention development in the 1st year and that intrusive maternal behavior may have a particularly disruptive effect on this process.
Anatomic modeling using 3D printing: quality assurance and optimization.
Leng, Shuai; McGee, Kiaran; Morris, Jonathan; Alexander, Amy; Kuhlmann, Joel; Vrieze, Thomas; McCollough, Cynthia H; Matsumoto, Jane
2017-01-01
The purpose of this study is to provide a framework for the development of a quality assurance (QA) program for use in medical 3D printing applications. An interdisciplinary QA team was built with expertise from all aspects of 3D printing. A systematic QA approach was established to assess the accuracy and precision of each step during the 3D printing process, including: image data acquisition, segmentation and processing, and 3D printing and cleaning. Validation of printed models was performed by qualitative inspection and quantitative measurement. The latter was achieved by scanning the printed model with a high resolution CT scanner to obtain images of the printed model, which were registered to the original patient images and the distance between them was calculated on a point-by-point basis. A phantom-based QA process, with two QA phantoms, was also developed. The phantoms went through the same 3D printing process as that of the patient models to generate printed QA models. Physical measurement, fit tests, and image based measurements were performed to compare the printed 3D model to the original QA phantom, with its known size and shape, providing an end-to-end assessment of errors involved in the complete 3D printing process. Measured differences between the printed model and the original QA phantom ranged from -0.32 mm to 0.13 mm for the line pair pattern. For a radial-ulna patient model, the mean distance between the original data set and the scanned printed model was -0.12 mm (ranging from -0.57 to 0.34 mm), with a standard deviation of 0.17 mm. A comprehensive QA process from image acquisition to completed model has been developed. Such a program is essential to ensure the required accuracy of 3D printed models for medical applications.
Topaz, Maxim; Lai, Kenneth; Dowding, Dawn; Lei, Victor J; Zisberg, Anna; Bowles, Kathryn H; Zhou, Li
2016-12-01
Electronic health records are increasingly used by nurses, with up to 80% of health data recorded as free text. However, only a few studies have developed nursing-relevant tools that help busy clinicians identify the information they need at the point of care. This study developed and validated one of the first automated natural language processing applications to extract wound information (wound type, pressure ulcer stage, wound size, anatomic location, and wound treatment) from free-text clinical notes. First, two human annotators manually reviewed a purposeful training sample (n=360) and a random test sample (n=1100) of clinical notes (50% discharge summaries and 50% outpatient notes), identified wound cases, and created a gold-standard dataset. We then trained and tested our natural language processing system (known as MTERMS) to process the wound information. Finally, we assessed our automated approach by comparing system-generated findings against the gold standard. We also compared the prevalence of wound cases identified from free-text data with coded diagnoses in the structured data. The testing dataset included 101 notes (9.2%) with wound information. Overall system performance was good (F-measure, a composite measure of the system's accuracy, of 92.7%), with the best results for wound treatment (F-measure=95.7%) and the poorest for wound size (F-measure=81.9%). Only 46.5% of wound notes had a structured code for a wound diagnosis. The natural language processing system achieved good performance on a subset of randomly selected discharge summaries and outpatient notes. In more than half of the wound notes there were no coded wound diagnoses, which highlights the significance of using natural language processing to enrich clinical decision making. Our future steps will include expanding the application's information coverage to other relevant wound factors and validating the model with external data.
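The F-measure used to evaluate the system above is a standard metric and can be computed directly. A minimal sketch (the set-based input format is an assumption for illustration; real NLP evaluation scores spans of extracted text against gold annotations):

```python
def f_measure(system, gold):
    """F1 score: harmonic mean of precision and recall, comparing the
    set of items the system extracted against the gold-standard set."""
    tp = len(system & gold)  # true positives: extracted AND in gold
    if not system or not gold or tp == 0:
        return 0.0
    precision = tp / len(system)  # fraction of extractions that are correct
    recall = tp / len(gold)       # fraction of gold items that were found
    return 2 * precision * recall / (precision + recall)


# System extracts {a, b, c}; gold standard is {a, b, d}:
# precision = 2/3, recall = 2/3, so F = 2/3.
print(f_measure({"a", "b", "c"}, {"a", "b", "d"}))
```

Per-category scores like the paper's 95.7% for wound treatment and 81.9% for wound size come from applying the same computation to each extracted attribute separately.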