Coggins, L.G.; Pine, William E.; Walters, C.J.; Martell, S.J.D.
2006-01-01
We present a new model to estimate capture probabilities, survival, abundance, and recruitment using traditional Jolly-Seber capture-recapture methods within a standard fisheries virtual population analysis framework. This approach compares the numbers of marked and unmarked fish at age captured in each year of sampling with predictions based on estimated vulnerabilities and abundance in a likelihood function. Recruitment to the earliest age at which fish can be tagged is estimated by using a virtual population analysis method to back-calculate the expected numbers of unmarked fish at risk of capture. By using information from both marked and unmarked animals in a standard fisheries age structure framework, this approach is well suited to the sparse data situations common in long-term capture-recapture programs with variable sampling effort. © Copyright by the American Fisheries Society 2006.
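The back-calculation step described in this abstract is, in spirit, the classic cohort-analysis recursion of virtual population analysis. A minimal sketch, assuming Pope's approximation and illustrative values for natural mortality and catch-at-age (none of these numbers come from the paper):

    # Hedged sketch: Pope's cohort-analysis approximation for back-calculating
    # numbers-at-age from catch-at-age, the core of a VPA. M and the catches
    # are illustrative assumptions, not values from the study.
    import math

    def vpa_backcalculate(catch_at_age, terminal_N, M=0.2):
        """Back-calculate cohort abundance from oldest to youngest age:
        N_a = N_{a+1} * e^M + C_a * e^{M/2} (Pope's approximation)."""
        N = [0.0] * (len(catch_at_age) + 1)
        N[-1] = terminal_N  # assumed abundance at the terminal age
        for a in range(len(catch_at_age) - 1, -1, -1):
            N[a] = N[a + 1] * math.exp(M) + catch_at_age[a] * math.exp(M / 2)
        return N[:-1]

    # Example: five age classes of catches, 500 fish assumed at the terminal age
    print(vpa_backcalculate([1200, 900, 600, 350, 150], terminal_N=500))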
Linear combination reading program for capture gamma rays
Tanner, Allan B.
1971-01-01
This program computes a weighting function, Qj, which gives a scalar output value of unity when applied to the spectrum of a desired element and a minimum value (considering statistics) when applied to spectra of materials not containing the desired element. Intermediate values are obtained for materials containing the desired element, in proportion to the amount of the element they contain. The program is written in the BASIC language in a format specific to the Hewlett-Packard 2000A Time-Sharing System, and is an adaptation of an earlier program for linear combination reading for X-ray fluorescence analysis (Tanner and Brinkerhoff, 1971). Following the program is a sample run from a study of the application of the linear combination technique to capture-gamma-ray analysis for calcium (report in preparation).
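The weighting-function idea lends itself to a short illustration. A minimal sketch in Python rather than the paper's BASIC, with synthetic spectra standing in for real capture-gamma-ray data; a minimum-norm least-squares solve is one reasonable way to impose the unity/zero conditions:

    # Hedged sketch of the linear-combination technique: find channel weights Q
    # that score 1.0 on the target element's spectrum and ~0 on interferents.
    # The spectra below are synthetic stand-ins, not Tanner's data.
    import numpy as np

    rng = np.random.default_rng(0)
    n_channels = 64
    target = rng.random(n_channels)             # spectrum of the desired element
    interferents = rng.random((3, n_channels))  # spectra of other materials

    # Constraints: target . Q = 1 and each interferent . Q = 0, solved jointly.
    A = np.vstack([target, interferents])
    b = np.array([1.0, 0.0, 0.0, 0.0])
    Q, *_ = np.linalg.lstsq(A, b, rcond=None)

    print(target @ Q)        # ~1.0
    print(interferents @ Q)  # ~0.0 for each interferent
    # A mixture scores in proportion to its content of the desired element:
    print((0.5 * target + interferents[0]) @ Q)  # ~0.5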
Constant-parameter capture-recapture models
Brownie, C.; Hines, J.E.; Nichols, J.D.
1986-01-01
Jolly (1982, Biometrics 38, 301-321) presented modifications of the Jolly-Seber model for capture-recapture data, which assume constant survival and/or capture rates. Where appropriate, because of the reduced number of parameters, these models lead to more efficient estimators than the Jolly-Seber model. The tests to compare models given by Jolly do not make complete use of the data, and we present here the appropriate modifications, and also indicate how to carry out goodness-of-fit tests which utilize individual capture history information. We also describe analogous models for the case where young and adult animals are tagged. The availability of computer programs to perform the analysis is noted, and examples are given using output from these programs.
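Comparing a constant-parameter model against the full Jolly-Seber model is, in general form, a nested-model likelihood-ratio test; a generic sketch (not the specific test statistics derived in the paper):

    \Lambda = -2\,(\ln L_{\text{constant}} - \ln L_{\text{JS}}) \;\sim\; \chi^2_k,
    \qquad k = (\text{parameters in JS}) - (\text{parameters in constant model}),

with large \Lambda favoring the time-varying Jolly-Seber model over the reduced, more efficient constant-parameter model.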
Final Scientific/Technical Report Carbon Capture and Storage Training Northwest - CCSTNW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Workman, James
This report details the activities of the Carbon Capture and Storage Training Northwest (CCSTNW) program from 2009 to 2013. The CCSTNW created, implemented, and provided Carbon Capture and Storage (CCS) training over the period of the program. With the assistance of an expert advisory board, CCSTNW created curriculum and conducted three short courses, more than three lectures, two symposiums, and a final conference. The program was conducted in five phases: 1) organization, gap analysis, and formation of an advisory board; 2) development of list serves, a website, and tech alerts; 3) a training needs survey; 4) lectures, courses, symposiums, and a conference; and 5) evaluation surveys and course evaluations. This program was conducted jointly by the Environmental Outreach and Stewardship Alliance (dba Northwest Environmental Training Center – NWETC) and Pacific Northwest National Laboratories (PNNL).
CZAEM USER'S GUIDE: MODELING CAPTURE ZONES OF GROUND-WATER WELLS USING ANALYTIC ELEMENTS
The computer program CZAEM is designed for elementary capture zone analysis, and is based on the analytic element method. CZAEM is applicable to confined and/or unconfined flow in shallow aquifers; the Dupuit-Forchheimer assumption is adopted. CZAEM supports the following analyt...
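For a single fully penetrating well in uniform flow under the Dupuit-Forchheimer assumption (the setting CZAEM addresses), the key capture-zone dimensions have closed textbook forms. A minimal sketch with illustrative inputs, not CZAEM output:

    # Hedged sketch of standard capture-zone relations for one well in uniform
    # flow (Dupuit-Forchheimer). Q: pumping rate, B: saturated thickness,
    # q0: ambient Darcy flux. All values below are illustrative assumptions.
    import math

    def capture_zone(Q, B, q0):
        return {
            "stagnation_distance": Q / (2 * math.pi * B * q0),  # downgradient of the well
            "half_width_at_well": Q / (4 * B * q0),
            "asymptotic_half_width": Q / (2 * B * q0),          # far upgradient
        }

    print(capture_zone(Q=500.0, B=20.0, q0=0.05))  # units: m^3/d, m, m/d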
Tacit Knowledge Capture and the Brain-Drain at Electrical Utilities
NASA Astrophysics Data System (ADS)
Perjanik, Nicholas Steven
As a consequence of an aging workforce, electric utilities are at risk of losing their most experienced and knowledgeable electrical engineers. In this research, the problem was a lack of understanding of what electric utilities were doing to capture the tacit knowledge or know-how of these engineers. The purpose of this qualitative research study was to explore the tacit knowledge capture strategies currently used in the industry by conducting a case study of 7 U.S. electrical utilities that have demonstrated an industry commitment to improving operational standards. The research question addressed the implemented strategies to capture the tacit knowledge of retiring electrical engineers and technical personnel. The research methodology involved a qualitative embedded case study. The theories used in this study included knowledge creation theory, resource-based theory, and organizational learning theory. Data were collected through one-time interviews of a senior electrical engineer or technician within each utility and a workforce planning or training professional within 2 of the 7 utilities. The analysis included the use of triangulation and content analysis strategies. Ten tacit knowledge capture strategies were identified: (a) formal and informal on-boarding mentorship and apprenticeship programs, (b) formal and informal off-boarding mentorship programs, (c) formal and informal training programs, (d) using lessons learned during training sessions, (e) communities of practice, (f) technology-enabled tools, (g) storytelling, (h) exit interviews, (i) rehiring of retirees as consultants, and (j) knowledge risk assessments. This research contributes to social change by offering strategies to capture the know-how needed to ensure operational continuity in the delivery of safe, reliable, and sustainable power.
Computer program for assessing the theoretical performance of a three dimensional inlet
NASA Technical Reports Server (NTRS)
Agnone, A. M.; Kung, F.
1972-01-01
A computer program for determining the theoretical performance of a three dimensional inlet is presented. An analysis for determining the capture area, ram force, spillage force, and surface pressure force is presented, along with the necessary computer program. A sample calculation is also included.
NASA Technical Reports Server (NTRS)
Benyo, Theresa L.
2002-01-01
Integration of a supersonic inlet simulation with a computer aided design (CAD) system is demonstrated. The integration is performed using the Project Integration Architecture (PIA). PIA provides a common environment for wrapping many types of applications. Accessing geometry data from CAD files is accomplished by incorporating appropriate function calls from the Computational Analysis Programming Interface (CAPRI). CAPRI is a CAD-vendor-neutral programming interface that aids in acquiring geometry data directly from CAD files. The benefits of wrapping a supersonic inlet simulation into PIA using CAPRI are: direct access to geometry data, accurate capture of geometry data, automatic conversion of data units, CAD-vendor-neutral operation, and on-line interactive history capture. This paper describes the PIA and the CAPRI wrapper and details the supersonic inlet simulation demonstration.
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks among all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
Military Police Operations and Counterinsurgency
2008-03-01
that deceitfully advertised PRUs as assassination squads. In reality most PRU operations ended in the successful capture and prosecution of VCI...and neutralize Kimathi whose capture "virtually ended Mau Mau resistance" (Asprey, 886). This campaign analysis will use the following categories to...for special access programs. 3.10. Employs specialized investigative techniques to include forensic and behavioral sciences and hypnosis. 3.11
Ripple Effect Mapping: A "Radiant" Way to Capture Program Impacts
ERIC Educational Resources Information Center
Kollock, Debra Hansen; Flage, Lynette; Chazdon, Scott; Paine, Nathan; Higgins, Lorie
2012-01-01
Learn more about a promising follow-up, participatory group process designed to document the results of Extension educational efforts within complex, real-life settings. The method, known as Ripple Effect Mapping, uses elements of Appreciative Inquiry, mind mapping, and qualitative data analysis to engage program participants and other community…
Passive fishing techniques: a cause of turtle mortality in the Mississippi River
Barko, V.A.; Briggler, J.T.; Ostendorf, D.E.
2004-01-01
We investigated variation of incidentally captured turtle mortality in response to environmental factors and passive fishing techniques. We used Long Term Resource Monitoring Program (LTRMP) data collected from 1996 to 2001 in the unimpounded upper Mississippi River (UMR) adjacent to Missouri and Illinois, USA. We used a principal components analysis (PCA) and a stepwise discriminant function analysis to identify factors correlated with mortality of captured turtles. Furthermore, we were interested in what percentage of turtles died from passive fishing techniques and which techniques caused the most turtle mortality. The main factors influencing captured turtle mortality were water temperature and depth at net deployment. Fyke nets captured the most turtles and caused the most turtle mortality. Almost 90% of mortalities occurred in offshore aquatic areas (i.e., side channel or tributary). Our results provide information on causes of turtle mortality (as bycatch) in a riverine system and implications for river turtle conservation by suggesting management strategies to reduce turtle bycatch and decrease mortality of captured turtles.
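The analysis pattern described above, PCA on environmental covariates followed by a discriminant analysis of mortality, is easy to sketch. A minimal version on synthetic data (stepwise variable selection omitted; the covariates are placeholders, not LTRMP fields):

    # Hedged sketch: PCA followed by a linear discriminant analysis of turtle
    # mortality, mirroring the analysis pattern above on synthetic data.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 4))  # e.g., water temp, depth, soak time, turbidity
    died = (X[:, 0] + X[:, 1] + rng.normal(size=200)) > 1  # synthetic mortality flag

    scores = PCA(n_components=2).fit_transform(X)
    lda = LinearDiscriminantAnalysis().fit(scores, died)
    print("in-sample classification accuracy:", lda.score(scores, died))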
Software for Real-Time Analysis of Subsonic Test Shot Accuracy
2014-03-01
used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming...video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to...DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains
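The frame-comparison approach the snippet describes can be illustrated with a few OpenCV calls: difference the current frame against a reference image of the target, then take contour centroids as shot coordinates. A minimal Python sketch of the general technique (the original used C++); the file names, threshold, and kernel size are assumptions:

    # Hedged sketch: locate new shot holes by differencing frames in OpenCV.
    import cv2

    ref = cv2.imread("target_before.png", cv2.IMREAD_GRAYSCALE)
    cur = cv2.imread("target_after.png", cv2.IMREAD_GRAYSCALE)

    diff = cv2.absdiff(cur, ref)  # pixels that changed between frames
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # drop speckle noise

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        M = cv2.moments(c)
        if M["m00"] > 0:
            cx, cy = M["m10"] / M["m00"], M["m01"] / M["m00"]
            print(f"shot centroid at pixel ({cx:.1f}, {cy:.1f})")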
U.S. Spacesuit Knowledge Capture Status and Initiatives
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Woods, Ron; Jairala, Juniper; Bitterly, Rose; McMann, Joe; Lewis, Cathleen
2011-01-01
The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration via publication of reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. Recently, the effort has been more concentrated and formalized whereby a new avenue of spacesuit knowledge capture has been added to the archives through which videotaping occurs, engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge or being interviewed to archive their significance to NASA's history. Now with video archiving, all these avenues of learning are brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, aspects of program management, and personal interviews. These archives of actual spacesuit legacy now reflect its rich history and will provide a wealth of knowledge which will greatly enhance the chances for the success of future and more ambitious spacesuit system programs. In this paper, NASA's formal spacesuit knowledge capture efforts will be reviewed and a status will be provided to reveal initiatives and accomplishments since the inception of the more formal U.S. spacesuit knowledge program. A detailed itemization of the actual archives will be addressed along with topics that are now available to the general NASA community and the public. Additionally, the latest developments in the archival relationship with the Smithsonian will be discussed.
U.S. Spacesuit Knowledge Capture Status and Initiatives
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Woods, Ron; Jairala, Juniper; Bitterly, Rose; McMann, Joe; Lewis, Cathleen
2012-01-01
The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration via publication of reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. Recently, the effort has been more concentrated and formalized whereby a new avenue of spacesuit knowledge capture has been added to the archives through which videotaping occurs, engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge or being interviewed to archive their significance to NASA's history. Now with video archiving, all these avenues of learning are brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, aspects of program management, and personal interviews. These archives of actual spacesuit legacy now reflect its rich history and will provide a wealth of knowledge which will greatly enhance the chances for the success of future and more ambitious spacesuit system programs. In this paper, NASA's formal spacesuit knowledge capture efforts will be reviewed and a status will be provided to reveal initiatives and accomplishments since the inception of the more formal U.S. spacesuit knowledge program. A detailed itemization of the actual archives will be addressed along with topics that are now available to the general NASA community and the public. Additionally, the latest developments in the archival relationship with the Smithsonian will be discussed.
ERIC Educational Resources Information Center
Waters, Eric L.
2010-01-01
Asynchronous online credit recovery programs have been implemented in public schools across the United States for a variety of reasons. In this case, African American female students who are deficient in course credits towards high school graduation have taken advantage of this relatively new e-programming mechanism as a means to capture course…
US Spacesuit Knowledge Capture
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen
2011-01-01
The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and human beings as users, spacesuit technology and usage lend themselves rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of those in the field. The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) Spacesuit Knowledge Capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. More recently the effort has been more concentrated and formalized whereby a new avenue of spacesuit knowledge capture has been added to the archives in which videotaping occurs, engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge. With video archiving, all these avenues of learning can now be brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. Scope and topics of U.S. spacesuit knowledge capture have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS) and the spacesuit community are now working together to provide a comprehensive closed-loop spacesuit knowledge capture system which includes
Analysis and Modeling of Ground Operations at Hub Airports
NASA Technical Reports Server (NTRS)
Atkins, Stephen (Technical Monitor); Andersson, Kari; Carr, Francis; Feron, Eric; Hall, William D.
2000-01-01
Building simple and accurate models of hub airports can considerably help one understand airport dynamics, and may provide quantitative estimates of operational airport improvements. In this paper, three models are proposed to capture the dynamics of busy hub airport operations. Two simple queuing models are introduced to capture the taxi-out and taxi-in processes. An integer programming model, aimed at representing airline decision-making, attempts to capture the dynamics of the aircraft turnaround process. These models can be applied for predictive purposes. They may also be used to evaluate control strategies for improving overall airport efficiency.
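The taxi-out process the abstract models as a simple queue can be sketched directly. A minimal single-server simulation with assumed rates (not the paper's calibrated models):

    # Hedged sketch: aircraft push back (Poisson arrivals) and queue for one
    # runway; the mean queue delay approximates taxi-out congestion.
    # The pushback rate and runway service time are illustrative assumptions.
    import random

    random.seed(2)
    pushback_rate = 0.8  # aircraft per minute entering the taxi system
    service_time = 1.0   # mean minutes of runway occupancy per departure

    t, runway_free_at, waits = 0.0, 0.0, []
    for _ in range(10000):
        t += random.expovariate(pushback_rate)  # next pushback time
        start = max(t, runway_free_at)          # wait if the runway is busy
        waits.append(start - t)
        runway_free_at = start + random.expovariate(1 / service_time)

    print(f"mean taxi-out queue delay: {sum(waits) / len(waits):.1f} min")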
Development of a metrics dashboard for monitoring involvement in the 340B Drug Pricing Program.
Karralli, Rusol; Tipton, Joyce; Dumitru, Doina; Scholz, Lisa; Masilamani, Santhi
2015-09-01
An electronic tool to support hospital organizations in monitoring and addressing financial and compliance challenges related to participation in the 340B Drug Pricing Program is described. In recent years there has been heightened congressional and regulatory scrutiny of the federal 340B program, which provides discounted drug prices on Medicaid-covered drugs to safety net hospitals and other 340B-eligible healthcare organizations, or "covered entities." Historically, the 340B program has lacked a metrics-driven reporting framework to help covered entities capture the value of 340B program involvement, community benefits provided to underserved populations, and costs associated with compliance with 340B eligibility requirements. As part of an initiative by a large health system to optimize its 340B program utilization and regulatory compliance efforts, a team of pharmacists led the development of an electronic dashboard tool to help monitor 340B program activities at the system's 340B-eligible facilities. After soliciting input from an array of internal and external 340B program stakeholders, the team designed the dashboard and associated data-entry tools to facilitate the capture and analysis of 340B program-related data in four domains: cost savings and revenue, program maintenance costs, community benefits, and compliance. A large health system enhanced its ability to evaluate and monitor 340B program-related activities through the use of a dashboard tool capturing key metrics on cost savings achieved, maintenance costs, and other aspects of program involvement. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Analysis and Perspective from the Complex Aerospace Systems Exchange (CASE) 2013
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Parker, Peter A.; Detweiler, Kurt N.; McGowan, Anna-Maria R.; Dress, David A.; Kimmel, William M.
2014-01-01
NASA Langley Research Center embedded four rapporteurs at the Complex Aerospace Systems Exchange (CASE) held in August 2013 with the objective of capturing the essence of the conference presentations and discussions. CASE was established to provide a discussion forum among chief engineers, program managers, and systems engineers on challenges in the engineering of complex aerospace systems. The meeting consists of invited presentations and panels from industry, academia, and government followed by discussions among attendees. This report presents the major and recurring themes captured throughout the meeting and provides analysis and insights to further the CASE mission.
Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy
NASA Astrophysics Data System (ADS)
Bucht, Curry; Söderberg, Per; Manneberg, Göran
2009-02-01
The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor. Morphometry of the corneal endothelium is presently done by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development of fully automated analysis of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images. The digitally enhanced images of the corneal endothelium were transformed, using the fast Fourier transform (FFT). Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image was used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained, using fully automated analysis software on images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis and a relatively large correlation was found.
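The Fourier route to cell density can be sketched compactly: the quasi-regular endothelial mosaic produces a ring in the image's power spectrum, the ring radius gives the dominant spatial frequency of the cells, and density scales with that frequency squared. A minimal Python sketch (the paper used Matlab); the hexagonal-packing factor and the calibration values are assumptions:

    # Hedged sketch: estimate endothelial cell density from the radius of the
    # dominant ring in the image power spectrum. Calibration is assumed.
    import numpy as np

    def cell_density_from_fft(img, um_per_px):
        F = np.fft.fftshift(np.fft.fft2(img - img.mean()))
        power = np.abs(F) ** 2
        cy, cx = power.shape[0] // 2, power.shape[1] // 2
        y, x = np.indices(power.shape)
        r = np.hypot(y - cy, x - cx).astype(int)
        radial = np.bincount(r.ravel(), power.ravel()) / np.bincount(r.ravel())
        ring = 2 + int(np.argmax(radial[2:cy]))     # skip the DC peak
        freq = ring / (power.shape[0] * um_per_px)  # cell rows per micrometer
        density_per_um2 = (2 / np.sqrt(3)) * freq ** 2  # hexagonal-packing assumption
        return density_per_um2 * 1e6                # cells per mm^2

    img = np.random.rand(256, 256)  # placeholder; use a CSM image in practice
    print(cell_density_from_fft(img, um_per_px=1.1))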
Generic drug discount programs: are prescriptions being submitted for pharmacy benefit adjudication?
Tungol, Alexandra; Starner, Catherine I; Gunderson, Brent W; Schafer, Jeremy A; Qiu, Yang; Gleason, Patrick P
2012-01-01
In 2006, pharmacies began offering select generic prescription drugs at discount prices (e.g., $4 for a 30-day supply) through nonmembership and membership programs. As part of the contract in membership generic drug discount programs, the member agrees to forgo submission of the claim to the insurance company. Claims not submitted for insurance adjudication may result in incomplete pharmacy benefit manager (PBM) and health plan data, which could negatively influence adherence reporting and clinical programs. To address potentially missing claims data, the Centers for Medicare & Medicaid Services (CMS) encourages Medicare Part D sponsors to incentivize network pharmacies to submit claims directly to the plan for drugs dispensed outside of a member's Part D benefit, unless a member refuses. The extent of PBM and health plan claims capture loss due to generic drug discount programs is unknown. To identify changes in levothyroxine utilizers' prescription claims capture rate following the advent of generic drug discount membership and nonmembership programs. This retrospective concurrent cohort study used claims data from 3.5 million commercially insured members enrolled in health plans located in the central and southern United States with Prime Therapeutics pharmacy benefit coverage. Members were required to be 18 years or older and younger than 60 years as of January 1, 2006, and continuously enrolled from January 1, 2006, through December 31, 2010. Members utilizing generic levothyroxine for at least 120 days during January 1, 2006, through June 30, 2006 (baseline period) from the same pharmacy group with supply on July 1, 2006, were placed into 1 of 3 pharmacy groups: (1) nonmembership (Walmart, Sam's Club, Target, Kroger, City Market, and King Soopers pharmacies), (2) membership (Walgreens, CVS, Albertsons, and Savon pharmacies), or (3) the reference group of all other pharmacies. The index date was defined as July 1, 2006. The levothyroxine claim providing supply on July 1, 2006, was the index claim. Members with a Kmart pharmacy index claim were excluded, since the Kmart membership drug discount program began prior to July 1, 2006. Levothyroxine claims capture nonpersistency, defined as the occurrence of a claim supply end date prior to a 180-day gap, was the primary outcome variable and was assessed from July 1, 2006, through June 30, 2010 (follow-up period). The odds of levothyroxine claims capture nonpersistency by pharmacy group were assessed using a logistic regression analysis adjusted for the following covariates: age, gender, median income in the ZIP code of residence (binomial for ≤ $50,000 vs. > $50,000), switch to a brand levothyroxine product during the follow-up period, index levothyroxine claim supply of 90 days or more, and index levothyroxine claim member cost share per 30-day supply in tertiles (≤ $5.00, $5.01-$7.99, ≥ $8.00). Of 2,632,855 eligible members aged 18 years or older, 13,427 met all study eligibility criteria. The baseline pharmacy groups were membership with 3,595 (26.8%), nonmembership with 1,919 (14.3%), and all other pharmacies with 7,913 (58.9%) members. The rates of levothyroxine claims capture persistency throughout the 4-year follow-up period were 85.4% for nonmembership (P = 0.593 vs. all other pharmacies), 77.7% for the membership group (P < 0.001 vs. all other pharmacies), and 85.9% for all other pharmacies.
The Kaplan-Meier comparison of claims capture persistency found nearly identical claims capture loss for the nonmembership group compared with the all other pharmacies group, and when compared in a multivariate logistic regression model, there was no difference in the odds of levothyroxine claims capture over 4 years of follow-up (OR = 1.01, 95% CI = 0.88-1.16, P = 0.900). The membership generic drug discount programs (Walgreens, CVS, Albertsons, and Savon pharmacies) had a statistically significant 61% higher odds (OR = 1.61, 95% CI = 1.45-1.79, P < 0.001) of levothyroxine claims capture nonpersistency. The onset of the difference between the membership group and the all other pharmacies group was temporally associated with the launch of the membership programs. In comparison to an index levothyroxine member cost of ≤ $5.00 per 30-day supply, higher cost shares were associated with higher levothyroxine claims capture nonpersistency ($5.01-$7.99: OR = 1.34, 95% CI = 1.19-1.52; ≥ $8.00: OR = 1.60, 95% CI = 1.40-1.82). Among levothyroxine utilizers in 2006 (prior to the advent of drug discount programs), those with claims from a pharmacy that subsequently implemented a nonmembership generic drug discount program did not appear to have a different rate of levothyroxine claims capture than members from the reference group when followed through June 2010. Utilizers with claims from a pharmacy that subsequently implemented a membership program had a significantly lower levothyroxine claims capture rate. Increasing index levothyroxine member cost was associated with higher levothyroxine claims capture loss. Because the analysis could not directly measure claims capture loss associated with members who switched to a new pharmacy group without presenting their insurance information (e.g., membership discount programs), further research is needed to confirm these findings.
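The adjusted logistic model described above follows a standard recipe; odds ratios are the exponentiated coefficients. A minimal sketch with statsmodels on synthetic data (variable names and values are illustrative assumptions, not the study's dataset):

    # Hedged sketch of an adjusted logistic regression for claims capture
    # nonpersistency; all data below are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 5000
    df = pd.DataFrame({
        "nonpersistent": rng.integers(0, 2, n),
        "membership": rng.integers(0, 2, n),
        "age": rng.integers(18, 60, n),
        "female": rng.integers(0, 2, n),
        "high_cost_share": rng.integers(0, 2, n),
    })
    fit = smf.logit("nonpersistent ~ membership + age + female + high_cost_share",
                    data=df).fit(disp=False)
    print(np.exp(fit.params))      # odds ratios, e.g., for membership pharmacies
    print(np.exp(fit.conf_int()))  # 95% CIs on the odds-ratio scale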
Iurov, Iu B; Khazatskiĭ, I A; Akindinov, V A; Dovgilov, L V; Kobrinskiĭ, B A; Vorsanova, S G
2000-08-01
The original software package FISHMet has been developed and tested for improving the efficiency of diagnosing hereditary diseases caused by chromosome aberrations and for chromosome mapping by the fluorescence in situ hybridization (FISH) method. The program supports the creation and analysis of pseudocolor chromosome images and hybridization signals under Windows 95, computer analysis and editing of the results of pseudocolor in situ hybridization, including successive superposition of the initial black-and-white images acquired through fluorescent filters (blue, green, and red), and editing of each image individually or of a summary pseudocolor image in BMP, TIFF, and JPEG formats. Components of the image analysis system (LOMO, Leitz Ortoplan, and Axioplan fluorescence microscopes; COHU 4910 and Sanyo VCB-3512P CCD cameras; Miro-Video, Scion LG-3, and VG-5 image capture boards; and Pentium 100 and Pentium 200 computers) and specialized software for image capture and visualization (Scion Image PC and Video-Cup) were used with good results in the study.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
... DEPARTMENT OF TRANSPORTATION Dynamic Mobility Applications and Data Capture Management Programs... The Intelligent Transportation System Joint Program Office (ITS JPO) will host a free public meeting to provide stakeholders an update on the Data Capture and Management (DCM) and Dynamic Mobility Applications (DMA...
A Combined Experimental/Computational Investigation of a Rocket Based Combined Cycle Inlet
NASA Technical Reports Server (NTRS)
Smart, Michael K.; Trexler, Carl A.; Goldman, Allen L.
2001-01-01
A rocket based combined cycle inlet geometry has undergone wind tunnel testing and computational analysis with Mach 4 flow at the inlet face. Performance parameters obtained from the wind tunnel tests were the mass capture, the maximum back-pressure, and the self-starting characteristics of the inlet. The CFD analysis supplied a confirmation of the mass capture, the inlet efficiency and the details of the flowfield structure. Physical parameters varied during the test program were cowl geometry, cowl position, body-side bleed magnitude and ingested boundary layer thickness. An optimum configuration was determined for the inlet as a result of this work.
Giles, Emma Louise; Adams, Jean M
2015-01-01
Capturing public opinion toward public health topics is important to ensure that services, policy, and research are aligned with the beliefs and priorities of the general public. A number of approaches can be used to capture public opinion. We are conducting a program of work on the effectiveness and acceptability of health promoting financial incentive interventions. We have captured public opinion on financial incentive interventions using three methods: a systematic review, focus group study, and analysis of online user-generated comments to news media reports. In this short editorial-style piece, we compare and contrast our experiences with these three methods. Each of these methods had their advantages and disadvantages. Advantages include tailoring of the research question for systematic reviews, probing of answers during focus groups, and the ability to aggregate a large data set using online user-generated content. However, disadvantages include needing to update systematic reviews, participants conforming to a dominant perspective in focus groups, and being unable to collect respondent characteristics during analysis of user-generated online content. That said, analysis of user-generated online content offers additional time and resource advantages, and we found it elicited similar findings to those obtained via more traditional methods, such as systematic reviews and focus groups. A number of methods for capturing public opinions on public health topics are available. Public health researchers, policy makers, and practitioners should choose methods appropriate to their aims. Analysis of user-generated online content, especially in the context of news media reports, may be a quicker and cheaper alternative to more traditional methods, without compromising on the breadth of opinions captured.
NASA Technical Reports Server (NTRS)
Hoerz, F. (Editor)
1986-01-01
Summaries of papers presented at the Workshop on Micrometeorite Capture Experiments are compiled. The goals of the workshop were to define the scientific objectives and the resulting performance requirements of a potential Space Station facility and to identify the major elements of a coherent development program that would generate the desired capabilities within the next decade. Specific topics include cosmic dust and space debris collection techniques, particle trajectory and source determination, and specimen analysis methods.
U.S. Spacesuit Knowledge Capture
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen
2010-01-01
The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and human beings as users, spacesuit technology and usage lend themselves rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of individuals and groups working in the field. The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. Recently, the effort has been more concentrated and formalized whereby a new avenue of spacesuit knowledge capture has been added to the archives through which videotaping occurs, engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge. Now with video archiving, all these avenues of learning can be brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS) and the spacesuit community are now working together to provide a closed-loop spacesuit knowledge capture system which includes specific attention to spacesuit system artifacts as well. A NASM report has recently been created that allows the cross-reference of history to the artifacts and the artifacts to the history, including spacesuit manufacturing details with current condition and location. NASA has examined spacesuits in the NASM collection for evidence of wear during their operational life. NASA's formal spacesuit knowledge capture efforts now make use of both the NASM spacesuit preservation collection and report to enhance its efforts to educate NASA personnel and contribute to spacesuit history. Be it archiving of human knowledge or archiving of the actual spacesuit legacy hardware with its rich history, the joining together of spacesuit system artifact history with that of development and use during past programs will provide a wealth of knowledge which will greatly enhance the chances for the success of future and more ambitious spacesuit system programs.
Impact of Domain Analysis on Reuse Methods
1989-11-06
return on the investment. The potential negative effects a "bad" domain analysis has on developing systems in the domain also increase the risks of a...importance of domain analysis as part of a software reuse program. A particular goal is to assist in avoiding the potential negative effects of ad hoc or...are specification objects discovered by performing object-oriented analysis. Object-based analysis approaches thus serve to capture a model of reality
Digital PIV (DPIV) Software Analysis System
NASA Technical Reports Server (NTRS)
Blackshire, James L.
1997-01-01
A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides automated image capture, test correlation, and autocorrelation analysis capabilities for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95, using a 'quickwin' format that allows simple user interface and output capabilities in the Windows environment.
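The correlation step at the heart of PIV analysis is compact enough to sketch: FFT-based cross-correlation of two interrogation windows, with the correlation-peak offset giving the particle displacement. A minimal Python illustration of the general technique, not the LaRC package itself:

    # Hedged sketch: displacement between two PIV interrogation windows via
    # FFT cross-correlation; the window contents below are synthetic.
    import numpy as np

    def piv_displacement(win_a, win_b):
        Fa = np.fft.fft2(win_a - win_a.mean())
        Fb = np.fft.fft2(win_b - win_b.mean())
        corr = np.fft.fftshift(np.real(np.fft.ifft2(Fa * np.conj(Fb))))
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        center = (corr.shape[0] // 2, corr.shape[1] // 2)
        return center[0] - peak[0], center[1] - peak[1]

    rng = np.random.default_rng(4)
    a = rng.random((32, 32))
    b = np.roll(a, shift=(3, 5), axis=(0, 1))  # particles moved 3 px down, 5 px right
    print(piv_displacement(a, b))              # -> (3, 5)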
Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noonan, Nicholas James
2015-07-01
This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).
Shih, Ya-Chen Tina; Chien, Chun-Ru; Moguel, Rocio; Hernandez, Mike; Hajek, Richard A; Jones, Lovell A
2016-04-01
To assess the cost-effectiveness of implementing a patient navigation (PN) program with capitated payment for Medicare beneficiaries diagnosed with lung cancer. Cost-effectiveness analysis. A Markov model was constructed to capture the disease progression of lung cancer and to characterize the clinical benefits of PN services as timeliness of treatment and care coordination. Taking a payer's perspective, we estimated the lifetime costs, life years (LYs), and quality-adjusted life years (QALYs) and addressed uncertainties in one-way and probabilistic sensitivity analyses. Model inputs were extracted from the literature, supplemented with data from a Centers for Medicare and Medicaid Services demonstration project. Compared to usual care, PN services incurred higher costs but also yielded better outcomes. The incremental cost and effectiveness were $9,145 and 0.47 QALYs, respectively, resulting in an incremental cost-effectiveness ratio of $19,312/QALY. One-way sensitivity analysis indicated that findings were most sensitive to a parameter capturing PN survival benefit for local-stage patients. The cost-effectiveness acceptability curve showed that the probability that the PN program was cost-effective was 0.80 and 0.91 at a societal willingness-to-pay of $50,000 and $100,000/QALY, respectively. Instituting a capitated PN program is cost-effective for lung cancer patients in Medicare. Future research should evaluate whether the same conclusion holds in other cancers. © Health Research and Educational Trust.
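The reported ratio follows directly from the incremental cost-effectiveness formula; recomputing from the rounded values in the abstract,

    \text{ICER} = \frac{\Delta C}{\Delta E} = \frac{\$9{,}145}{0.47\ \text{QALY}} \approx \$19{,}457/\text{QALY},

which is close to the reported $19,312/QALY; the small gap presumably reflects rounding of the incremental cost and effectiveness as quoted in the abstract.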
Capturing Creative Program Management Best Practices
2013-04-01
Maryland Bottleneck Analysis on the DoD Pre-Milestone B Acquisition Processes Danielle Worger and Teresa Wu, Arizona State University Eugene Rex Jalao...Creative Program Management Best Practices Brandon Keller and J. Robert Wirthlin, Air Force Institute of Technology The RITE Approach to Agile ...Mechanism for Adaptive Change Kathryn Aten and John T. Dillard, Naval Postgraduate School A Comparative Assessment of the Navy's Future Naval Capabilities
Corso, Phaedra S.; Ingels, Justin B.; Kogan, Steven M.; Foster, E. Michael; Chen, Yi-Fu; Brody, Gene H.
2013-01-01
Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95% confidence interval) incremental difference was $2149 ($397, $3901). With the probabilistic sensitivity analysis approach, the incremental difference was $2583 ($778, $4346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention. PMID:23299559
Corso, Phaedra S; Ingels, Justin B; Kogan, Steven M; Foster, E Michael; Chen, Yi-Fu; Brody, Gene H
2013-10-01
Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95% confidence interval) incremental difference was $2,149 ($397, $3,901). With the probabilistic sensitivity analysis approach, the incremental difference was $2,583 ($778, $4,346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention.
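The probabilistic sensitivity analysis the authors favor can be sketched in a few lines: fit a distribution to each collected cost input, sample the inputs jointly, and summarize the resulting incremental-cost distribution. A minimal sketch; the gamma distributions and their parameters are illustrative assumptions, not the trial's data:

    # Hedged sketch of probabilistic sensitivity analysis for programmatic costs.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 10000

    # Cost components of the intervention arm, each sampled from an assumed
    # gamma distribution (right-skewed and non-negative, as cost data often are).
    intervention = (rng.gamma(20, 60, n)    # staff time per family
                    + rng.gamma(5, 30, n)   # materials
                    + rng.gamma(3, 40, n))  # travel
    comparison = rng.gamma(10, 45, n)       # comparison-condition cost

    incremental = intervention - comparison
    lo, hi = np.percentile(incremental, [2.5, 97.5])
    print(f"mean incremental cost: ${incremental.mean():,.0f} "
          f"(95% interval ${lo:,.0f} to ${hi:,.0f})")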
Social Return on Investment: A New Approach to Understanding and Advocating for Value in Healthcare.
Laing, Catherine M; Moules, Nancy J
2017-12-01
To determine whether the methodology of social return on investment (SROI) could be a way in which the value of a healthcare-related program (children's cancer camp) could be captured, evaluated, and communicated. The value of healthcare goes beyond what can be captured in financial terms; however, this is the most common type of value that is measured. The SROI methodology accounts for a broader concept of value by measuring social, environmental, and economic outcomes and uses monetary values to represent them. The steps/stages of an SROI analysis were applied to the context of a children's camp for this article. Applying the SROI methodology to this healthcare-related program was feasible and provided insight and understanding related to the impacts of this program. Because of SROI's flexibility, it is a tool that has great potential in a healthcare environment and for leaders to evaluate programmatic return on investment.
Evaluation of Solid Sorbents as a Retrofit Technology for CO2 Capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sjostrom, Sharon
2016-06-02
ADA completed a DOE-sponsored program titled Evaluation of Solid Sorbents as a Retrofit Technology for CO2 Capture under program DE-FE0004343. During this program, sorbents were analyzed for use in a post-combustion CO2 capture process. A supported amine sorbent was selected based upon its superior performance in adsorbing a greater amount of CO2 than the activated carbon sorbents tested. When the most promising sorbent at the time was selected, it was characterized and used to create a preliminary techno-economic analysis (TEA). A preliminary 550 MW coal-fired power plant using Illinois #6 bituminous coal was designed with a solid sorbent CO2 capture system using the selected supported amine sorbent, both to facilitate the TEA and to create the necessary framework to scale down the design to a 1 MWe equivalent slipstream pilot facility. The preliminary techno-economic analysis showed promising results and potential for improved performance for CO2 capture compared to conventional MEA systems. As a result, a 1 MWe equivalent solid sorbent system was designed, constructed, and then installed at a coal-fired power plant in Alabama. The pilot was designed to capture 90% of the CO2 from the incoming flue gas at 1 MWe net electrical generating equivalent. Testing was not possible at the design conditions due to changes in sorbent handling characteristics at post-regenerator temperatures that were not properly incorporated into the pilot design. Thus, severe pluggage occurred at nominally 60% of the design sorbent circulation rate with heated sorbent, although no handling issues were noted when the system was operated prior to bringing the regenerator to operating temperature. Testing within the constraints of the pilot plant resulted in 90% capture of the incoming CO2 at a flow rate equivalent of 0.2 to 0.25 MWe net electrical generating equivalent. The reduction in equivalent flow rate at 90% capture was primarily the result of sorbent circulation limitations at operating temperatures combined with pre-loading of the sorbent with CO2 prior to entering the adsorber. Specifically, CO2-rich gas was utilized to convey sorbent from the regenerator to the adsorber. This gas was nominally 45°C below the regenerator temperature during testing. ADA's post-combustion capture system with modifications to overcome pilot constraints, in conjunction with a sorbent having a CO2 working capacity of 15 g CO2/100 g sorbent and a flue gas contact time of 10 to 15 minutes or less, could provide significant cost and performance benefits when compared to an MEA system.
McInerney-Leo, Aideen M; Marshall, Mhairi S; Gardiner, Brooke; Coucke, Paul J; Van Laer, Lut; Loeys, Bart L; Summers, Kim M; Symoens, Sofie; West, Jennifer A; West, Malcolm J; Paul Wordsworth, B; Zankl, Andreas; Leo, Paul J; Brown, Matthew A; Duncan, Emma L
2013-01-01
Osteogenesis imperfecta (OI) and Marfan syndrome (MFS) are common Mendelian disorders. Both conditions are usually diagnosed clinically, as genetic testing is expensive due to the size and number of potentially causative genes and mutations. However, genetic testing may benefit patients, at-risk family members and individuals with borderline phenotypes, as well as improving genetic counseling and allowing critical differential diagnoses. We assessed whether whole exome sequencing (WES) is a sensitive method for mutation detection in OI and MFS. WES was performed on genomic DNA from 13 participants with OI and 10 participants with MFS who had known mutations, with exome capture followed by massive parallel sequencing of multiplexed samples. Single nucleotide polymorphisms (SNPs) and small indels were called using Genome Analysis Toolkit (GATK) and annotated with ANNOVAR. CREST, exomeCopy and exomeDepth were used for large deletion detection. Results were compared with the previous data. Specificity was calculated by screening WES data from a control population of 487 individuals for mutations in COL1A1, COL1A2 and FBN1. The target capture of five exome capture platforms was compared. All 13 mutations in the OI cohort and 9/10 in the MFS cohort were detected (sensitivity=95.6%) including non-synonymous SNPs, small indels (<10 bp), and a large UTR5/exon 1 deletion. One mutation was not detected by GATK due to strand bias. Specificity was 99.5%. Capture platforms and analysis programs differed considerably in their ability to detect mutations. Consumable costs for WES were low. WES is an efficient, sensitive, specific and cost-effective method for mutation detection in patients with OI and MFS. Careful selection of platform and analysis programs is necessary to maximize success. PMID:24501682
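The headline sensitivity is a simple tally over the two cohorts: 13 of 13 OI mutations plus 9 of 10 MFS mutations were detected, so

    \text{sensitivity} = \frac{TP}{TP + FN} = \frac{13 + 9}{13 + 10} = \frac{22}{23} \approx 95.7\%,

consistent with the 95.6% quoted in the abstract (the small difference is rounding).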
MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.
Elliott, Mark T; Welchman, Andrew E; Wing, Alan M
2009-02-15
Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy, the capture and automatic storage of corresponding participant responses and an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox and so can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.
Dynamic analysis of Apollo-Salyut/Soyuz docking
NASA Technical Reports Server (NTRS)
Schliesing, J. A.
1972-01-01
The use of a docking-system computer program in analyzing the dynamic environment produced by two impacting spacecraft and their attitude control systems is discussed. Performance studies were conducted to determine the sensitivity of mechanism loads and capture to parametric changes in the initial impact conditions. As indicated by the studies, capture latching is most sensitive to vehicle angular-alignment errors and least sensitive to lateral-miss error. As shown by the load-sensitivity studies, peak loads acting on the Apollo spacecraft are considerably lower than the Apollo design-limit loads.
Knowledge Capture and Management for Space Flight Systems
NASA Technical Reports Server (NTRS)
Goodman, John L.
2005-01-01
The incorporation of knowledge capture and knowledge management strategies early in the development phase of an exploration program is necessary for safe and successful missions of human and robotic exploration vehicles over the life of a program. Following the transition from the development to the flight phase, loss of the underlying theory and rationale governing design and requirements occurs through a number of mechanisms. This degrades the quality of engineering work, resulting in increased life cycle costs and risk to mission success and safety of flight. Due to budget constraints, concerned personnel in legacy programs often have to improvise methods for knowledge capture and management using existing, but often sub-optimal, information technology and archival resources. Application of advanced information technology to perform knowledge capture and management would be most effective if program-wide requirements are defined at the beginning of a program.
Analysis of Phase-Type Stochastic Petri Nets With Discrete and Continuous Timing
NASA Technical Reports Server (NTRS)
Jones, Robert L.; Goode, Plesent W. (Technical Monitor)
2000-01-01
The Petri net formalism is useful in studying many discrete-state, discrete-event systems exhibiting concurrency, synchronization, and other complex behavior. As a bipartite graph, the net can conveniently capture salient aspects of the system. As a mathematical tool, the net can specify an analyzable state space. Indeed, one can reason about certain qualitative properties (from state occupancies) and how they arise (the sequence of events leading there). By introducing deterministic or random delays, the model is forced to sojourn in states some amount of time, giving rise to an underlying stochastic process, one that can be specified in a compact way and is capable of providing quantitative, probabilistic measures. We formalize a new non-Markovian extension to the Petri net that captures both discrete and continuous timing in the same model. The approach affords efficient stationary analysis in most cases and efficient transient analysis under certain restrictions. Moreover, this new formalism has an added benefit in modeling fidelity, stemming from the simultaneous capture of discrete- and continuous-time events (as opposed to capturing only one and approximating the other). We show how the underlying stochastic process, which is non-Markovian, can be resolved into simpler Markovian problems that enjoy efficient solutions. Solution algorithms are provided that can be easily programmed.
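The "simpler Markovian problems" the abstract mentions admit standard efficient solutions; for transient analysis of a continuous-time Markov chain, uniformization is the usual workhorse. A generic sketch of that technique (not the paper's specific algorithms), with an arbitrary three-state generator:

    # Hedged sketch: transient CTMC analysis by uniformization,
    # p(t) = sum_n e^{-Lt} (Lt)^n / n! * p0 @ P^n, with P = I + Q/L.
    import numpy as np

    def transient(Q, p0, t, eps=1e-10):
        L = max(-Q.diagonal()) * 1.05  # uniformization rate
        P = np.eye(len(Q)) + Q / L
        term, pt = p0.copy(), np.zeros_like(p0)
        weight, n = np.exp(-L * t), 0  # Poisson(n; Lt) weights
        while weight > eps or n < L * t:
            pt += weight * term
            n += 1
            weight *= L * t / n
            term = term @ P
        return pt

    Q = np.array([[-2.0, 1.5, 0.5],   # arbitrary example generator
                  [1.0, -1.0, 0.0],   # (rows sum to zero)
                  [0.2, 0.3, -0.5]])
    print(transient(Q, p0=np.array([1.0, 0.0, 0.0]), t=2.0))  # sums to ~1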
Monte Carlo analysis of TRX lattices with ENDF/B version 3 data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardy, J. Jr.
1975-03-01
Four TRX water-moderated lattices of slightly enriched uranium rods have been reanalyzed with consistent ENDF/B Version 3 data by means of the full-range Monte Carlo program RECAP. The following measured lattice parameters were studied: ratio of epithermal-to-thermal ²³⁸U captures, ratio of epithermal-to-thermal ²³⁵U fissions, ratio of ²³⁸U captures to ²³⁵U fissions, ratio of ²³⁸U fissions to ²³⁵U fissions, and multiplication factor. In addition to the base calculations, some studies were done to find the sensitivity of the TRX lattice parameters to selected variations of cross-section data. Finally, additional experimental evidence is afforded by effective ²³⁸U capture integrals for isolated rods. Shielded capture integrals were calculated for ²³⁸U metal and oxide rods. These are compared with other measurements.
Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy
NASA Astrophysics Data System (ADS)
Bucht, Curry; Söderberg, Per; Manneberg, Göran
2010-02-01
The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor of the corneal endothelium. Pathological conditions and physical trauma may threaten the endothelial cell density to such an extent that the optical properties of the cornea, and thus clear eyesight, are threatened. Diagnosis of the corneal endothelium through morphometry is an important part of several clinical applications. Morphometry of the corneal endothelium is presently carried out by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development and use of fully automated analysis of a very large range of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images, normalizing lights and contrasts. The digitally enhanced images of the corneal endothelium were Fourier transformed, using the fast Fourier transform (FFT), and stored as new images. Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier-transformed images. The data obtained from each Fourier-transformed image were used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained using the fully automated analysis software on 292 images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a relatively high correlation was found.
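The paper's Matlab code is not given; the Python sketch below illustrates the general idea of reading a dominant spatial frequency off the FFT power spectrum and converting it to a cell density. The radial-average peak search and the hexagonal-packing factor are simplifying assumptions, and the synthetic image and pixel scale are hypothetical:

```python
import numpy as np

def cell_density_from_image(img: np.ndarray, px_per_mm: float) -> float:
    """Estimate cell density (cells/mm^2) from the dominant ring in the
    image power spectrum, assuming a roughly hexagonal cell mosaic."""
    img = img - img.mean()                       # normalize lights/contrast
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = img.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h / 2, xx - w / 2).astype(int)
    radial = np.bincount(r.ravel(), spec.ravel()) / np.bincount(r.ravel())
    k = radial[2:].argmax() + 2                  # skip the DC neighbourhood
    spacing_mm = (w / k) / px_per_mm             # dominant cell spacing
    return (2 / np.sqrt(3)) / spacing_mm ** 2    # hexagonal packing density

# Synthetic demo: a 10-pixel-period pattern standing in for the mosaic.
y, x = np.mgrid[0:256, 0:256]
img = np.cos(2 * np.pi * x / 10) + np.cos(2 * np.pi * y / 10)
print(f"{cell_density_from_image(img, px_per_mm=500):.0f} cells/mm^2")
```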
Integration and Interoperability: An Analysis to Identify the Attributes for System of Systems
2008-09-01
divisions of the enterprise. Examples of the current I2 are: • a nightly feed of elearning information is captured through an automated and…standardized process throughout the enterprise, and • the LMS has been integrated with SkillSoft, a third-party elearning software system (http…Command (JITC) is responsible for testing all programs that utilize standard interfaces to specific global nets or systems. Many times programs that
Activity Theory and Qualitative Research in Digital Domains
ERIC Educational Resources Information Center
Sam, Cecile
2012-01-01
Understanding the interactions between people, computer-mediated communication, and online life requires that researchers appropriate a set of methodological tools that would be best suited for capturing and analyzing the phenomenon. However, these tools are not limited to relevant technological forms of data collections and analysis programs; it…
ANALYSIS OF LEAD IN CANDLE PARTICULATE EMISSIONS BY XRF USING UNIQUANT 4
As part of an extensive program to study the small combustion sources of indoor fine particulate matter (PM), candles with lead-core wicks were burned in a 46-L glass flow- through chamber. The particulate emissions with aerodynamic diameters <10 micrometers (PM10) were captured ...
Paramedir: A Tool for Programmable Performance Analysis
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Labarta, Jesus; Gimenez, Judit
2004-01-01
Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations, thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers so that it can be transferred to novice users.
Using timed event sequential data in nursing research.
Pecanac, Kristen E; Doherty-King, Barbara; Yoon, Ju Young; Brown, Roger; Schiefelbein, Tony
2015-01-01
Measuring behavior is important in nursing research, and innovative technologies are needed to capture the "real-life" complexity of behaviors and events. The purpose of this article is to describe the use of timed event sequential data in nursing research and to demonstrate the use of this data in a research study. Timed event sequencing allows the researcher to capture the frequency, duration, and sequence of behaviors as they occur in an observation period and to link the behaviors to contextual details. Timed event sequential data can easily be collected with handheld computers, loaded with a software program designed for capturing observations in real time. Timed event sequential data add considerable strength to analysis of any nursing behavior of interest, which can enhance understanding and lead to improvement in nursing practice.
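Timed event sequential data reduce to triples of onset, offset, and behavior code; the short Python sketch below computes the frequency, duration, and sequence summaries the article describes. The behavior codes and times are invented:

```python
from collections import Counter

# Hypothetical timed-event records: (onset_s, offset_s, behavior_code)
events = [(0.0, 4.2, "stand"), (4.2, 9.0, "walk"),
          (9.0, 11.5, "stand"), (11.5, 30.0, "sit")]

codes = [code for _, _, code in events]
frequency = Counter(codes)
duration = Counter()
for onset, offset, code in events:
    duration[code] += offset - onset
transitions = Counter(zip(codes, codes[1:]))

print(frequency)     # how often each behavior occurred
print(duration)      # total time spent in each behavior
print(transitions)   # which behavior followed which (sequence)
```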
Description of a user-oriented geographic information system - The resource analysis program
NASA Technical Reports Server (NTRS)
Tilmann, S. E.; Mokma, D. L.
1980-01-01
This paper describes the Resource Analysis Program, an applied geographic information system. Several applications are presented which utilized soil and other natural resource data to develop integrated maps and data analyses. These applications demonstrate the methods of analysis and the philosophy of approach used in the mapping system. The applications are evaluated against four major needs of a functional mapping system: data capture, data libraries, data analysis, and mapping and data display. These four criteria are then used to describe an effort to develop the next generation of applied mapping systems. This approach uses inexpensive microcomputers for field applications and should prove to be a viable entry point for users heretofore unable or unwilling to venture into applied computer mapping.
Software Review: A program for testing capture-recapture data for closure
Stanley, Thomas R.; Richards, Jon D.
2005-01-01
Capture-recapture methods are widely used to estimate population parameters of free-ranging animals. Closed-population capture-recapture models, which assume there are no additions to or losses from the population over the period of study (i.e., the closure assumption), are preferred for population estimation over open-population models, which do not assume closure, because heterogeneity in detection probabilities can be accounted for, and this improves estimates. In this paper we introduce CloseTest, a new Microsoft® Windows-based program that computes the Otis et al. (1978) and Stanley and Burnham (1999) closure tests for capture-recapture data sets. Information on CloseTest features and where to obtain the program is provided.
Participatory Data Collection Technique for Capturing Beginning Farmer Program Outcomes
ERIC Educational Resources Information Center
Eschbach, Cheryl L.; Sirrine, J. R.; Lizotte, Erin; Rothwell, N. L.
2016-01-01
This article describes an innovative evaluation plan we employed to capture outcomes of a multiyear beginning farmer program and, specifically, highlights the facilitation technique we used to document short-term and intermediate goals of the program that matched U.S. Department of Agriculture grant requirements and Extension administration…
Alternative Fuels Data Center: Maps and Data
Emissions comparison of heavy-duty vehicles as captured by the Clean Cities Program (last update February …). Emissions comparison of light-duty vehicles as captured by the Clean Cities Program (last update February …).
Alternative Fuels Data Center: Maps and Data
…gas emissions comparison of heavy-duty vehicles as captured by the Clean Cities Program (last update …). …gas emissions comparison of light-duty vehicles as captured by the Clean Cities Program (last update …).
Innovative Assessment Tools for a Short, Fast-Paced, Summer Field Course
ERIC Educational Resources Information Center
Baustian, Melissa M.; Bentley, Samuel J.; Wandersee, James H.
2008-01-01
An experiential science program, such as a summer course at a field station, requires unique assessment tools. Traditional assessment via a pencil-and-paper exam cannot capture the essential skills and concepts learned at a summer field station. Therefore, the authors developed a pre- and postcourse image-based analysis to evaluate student…
A Knowledge Base for FIA Data Uses
Victor A. Rudis
2005-01-01
Knowledge management provides a way to capture the collective wisdom of an organization, facilitate organizational learning, and foster opportunities for improvement. This paper describes a knowledge base compiled from uses of field observations made by the U.S. Department of Agriculture Forest Service, Forest Inventory and Analysis program and a citation database of...
Gibau, Gina Sanchez
2015-01-01
Qualitative studies that examine the experiences of underrepresented minority students in science, technology, engineering, and mathematics fields are comparatively few. This study explores the self-reported experiences of underrepresented graduate students in the biomedical sciences of a large, midwestern, urban university. Document analysis of interview transcripts from program evaluations captures firsthand accounts of student experiences and reveals the need for a critical examination of current intervention programs designed to reverse the trend of underrepresentation in the biomedical sciences. Findings point to themes aligned around the benefits and challenges of program components, issues of social adjustment, the utility of supportive relationships, and environmental impacts. PMID:26163562
Almberg, Kirsten S; Friedman, Lee S; Swedler, David; Cohen, Robert A
2018-05-01
The Mine Safety and Health Administration (MSHA) requires reporting of injuries and illnesses to their Part 50 program. A 2011 study indicated that the Part 50 program did not capture many cases of injury in Kentucky, causing concern about underreporting in other states. MSHA Part 50 reports from Illinois for 2001-2013 were linked to Illinois Workers' Compensation Commission (IWCC) data. IWCC cases not found in the Part 50 data were considered unreported. Overall, the Part 50 program did not capture 66% of IWCC cases from 2001 to 2013. Chronic injuries or illnesses were more likely to be unreported to MSHA. The majority of occupational injuries and illnesses found in the IWCC from this time period were not captured by Part 50. Inaccurate reporting of injuries and illnesses to the Part 50 program hinders MSHA's ability to enforce safety and health standards in the mining industry. © 2018 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
N /A
2003-12-18
The CTUIR and ODFW propose to expand their monitoring and evaluation for the Grande Ronde spring chinook supplementation program to take additional data on summer steelhead that are trapped at the existing adult collection weirs on the upper Grande Ronde River and Catherine Creek. The weirs are a movable design and are operated seasonally during the adult chinook migration. Bull trout and summer steelhead have been trapped at the weirs since 1997 incidental to the spring chinook broodstock collection activities. Minimal data is recorded on both species as a requirement of the ESA permits, and reported to USFWS and NOAA Fisheries. This supplement analysis covers a minor expansion of the program to collect more extensive life history data on summer steelhead. The weir and trap will be installed 2-3 weeks earlier (early to mid-March) than was previously needed for the spring chinook broodstock collection in order to monitor the summer steelhead migration period. The adult steelhead will be captured in the traps, anesthetized, and measured. Data will be recorded on the date of capture, fork length, sex, markings, and maturity of the fish, and scale and punch tissue samples will be taken for genetic analyses.
Value flow mapping: Using networks to inform stakeholder analysis
NASA Astrophysics Data System (ADS)
Cameron, Bruce G.; Crawley, Edward F.; Loureiro, Geilson; Rebentisch, Eric S.
2008-02-01
Stakeholder theory has garnered significant interest from the corporate community, but has proved difficult to apply to large government programs. A detailed value flow exercise was conducted to identify the value delivery mechanisms among stakeholders for the current Vision for Space Exploration. We propose a method for capturing stakeholder needs that explicitly recognizes the outcomes required of the value creating organization. The captured stakeholder needs are then translated into input-output models for each stakeholder, which are then aggregated into a network model. Analysis of this network suggests that benefits are infrequently linked to the root provider of value. Furthermore, it is noted that requirements should not only be written to influence the organization's outputs, but also to influence the propagation of benefit further along the value chain. A number of future applications of this model to systems architecture and requirement analysis are discussed.
INEEL BNCT research program. Annual report, January 1, 1996--December 31, 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venhuizen, J.R.
1997-04-01
This report is a summary of the progress and research produced for the Idaho National Engineering and Environmental Laboratory (INEEL) Boron Neutron Capture Therapy (BNCT) Research Program for calendar year 1996. Contributions from the individual investigators about their projects are included, specifically, physics: treatment planning software, real-time neutron beam measurement dosimetry, measurement of the Finnish research reactor epithermal neutron spectrum, BNCT accelerator technology; and chemistry: analysis of biological samples and preparation of ¹⁰B-enriched decaborane.
NASA Astrophysics Data System (ADS)
Piao, Chunhui; Han, Xufang; Wu, Harris
2010-08-01
We provide a formal definition of an e-commerce transaction network. Agent-based modelling is used to simulate e-commerce transaction networks. For real-world analysis, we studied the open application programming interfaces (APIs) from eBay and Taobao e-commerce websites and captured real transaction data. Pajek is used to visualise the agent relationships in the transaction network. We derived one-mode networks from the transaction network and analysed them using degree and betweenness centrality. Integrating multi-agent modelling, open APIs and social network analysis, we propose a new way to study large-scale e-commerce systems.
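As a concrete illustration of the workflow described (not the authors' code), the Python sketch below builds a toy bipartite transaction network, derives a one-mode seller network, and computes degree and betweenness centrality with networkx. The transaction list is fabricated:

```python
import networkx as nx

# Hypothetical buyer-seller transactions (a bipartite transaction network).
transactions = [("buyer1", "sellerA"), ("buyer2", "sellerA"),
                ("buyer2", "sellerB"), ("buyer3", "sellerB"),
                ("buyer3", "sellerC")]

B = nx.Graph(transactions)
sellers = {seller for _, seller in transactions}

# One-mode projection: two sellers are linked when they share a buyer.
G = nx.bipartite.projected_graph(B, sellers)
print(nx.degree_centrality(G))
print(nx.betweenness_centrality(G))
```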
Capture zones for simple aquifers
McElwee, Carl D.
1991-01-01
Capture zones showing the area influenced by a well within a certain time are useful for both aquifer protection and cleanup. If hydrodynamic dispersion is neglected, a deterministic curve defines the capture zone. Analytical expressions for the capture zones can be derived for simple aquifers. However, the capture zone equations are transcendental and cannot be explicitly solved for the coordinates of the capture zone boundary. Fortunately, an iterative scheme allows the solution to proceed quickly and efficiently even on a modest personal computer. Three forms of the analytical solution must be used in an iterative scheme to cover the entire region of interest, after the extreme values of the x coordinate are determined by an iterative solution. The resulting solution is a discrete one, and usually 100-1000 intervals along the x-axis are necessary for a smooth definition of the capture zone. The presented program is written in FORTRAN and has been used in a variety of computing environments. No graphics capability is included with the program; it is assumed the user has access to a commercial package. The superposition of capture zones for multiple wells is expected to be satisfactory if the spacing is not too close. Because this program deals with simple aquifers, the results rarely will be the final word in a real application.
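The report's FORTRAN program is not reproduced here. As a sketch of the iterative idea, the Python snippet below treats the steady-state (ultimate) capture zone of a single well in uniform regional flow, solving the transcendental boundary equation for y at chosen x stations by bisection. Parameter values are illustrative, and the time-related zones treated in the report require travel-time integration beyond this sketch:

```python
import numpy as np

# Bounding streamline of the ultimate capture zone (well at the origin,
# regional flow toward -x):  U*y + (Q/(2*pi*B))*atan2(y, x) = Q/(2*B).
# Transcendental in y, so solve by bisection at each x (upper half shown).
Q, B, U = 500.0, 10.0, 0.5     # pumping rate, aquifer thickness, Darcy flux

def residual(y, x):
    return U * y + Q / (2 * np.pi * B) * np.arctan2(y, x) - Q / (2 * B)

y_max = Q / (2 * B * U)        # asymptotic half-width far upstream
for x in np.linspace(-15.0, 200.0, 6):   # zone closes near the stagnation point
    lo, hi = 1e-9, 1.5 * y_max           # bracket the root in the upper half-plane
    for _ in range(60):                  # plain bisection, robust and fast
        mid = 0.5 * (lo + hi)
        if residual(lo, x) * residual(mid, x) <= 0:
            hi = mid
        else:
            lo = mid
    print(f"x = {x:7.1f}   y = {0.5 * (lo + hi):7.2f}")
```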
Gopalaswamy, Arjun M.; Royle, J. Andrew; Hines, James E.; Singh, Pallavi; Jathanna, Devcharan; Kumar, N. Samba; Karanth, K. Ullas
2012-01-01
1. The advent of spatially explicit capture-recapture models is changing the way ecologists analyse capture-recapture data. However, the advantages offered by these new models are not fully exploited because they can be difficult to implement. 2. To address this need, we developed a user-friendly software package, created within the R programming environment, called SPACECAP. This package implements Bayesian spatially explicit hierarchical models to analyse spatial capture-recapture data. 3. Given that a large number of field biologists prefer software with graphical user interfaces for analysing their data, SPACECAP is particularly useful as a tool to increase the adoption of Bayesian spatially explicit capture-recapture methods in practice.
ERIC Educational Resources Information Center
Bueno de Mesquita, Paul; Dean, Ross F.; Young, Betty J.
2010-01-01
Advances in digital video technology create opportunities for more detailed qualitative analyses of actual teaching practice in science and other subject areas. User-friendly digital cameras and highly developed, flexible video-analysis software programs have made the tasks of video capture, editing, transcription, and subsequent data analysis…
ERIC Educational Resources Information Center
Dunbar, Laura
2014-01-01
This article is an introduction to video screen capture. Basic information of two software programs, QuickTime for Mac and BlueBerry Flashback Express for PC, are also discussed. Practical applications for video screen capture are given.
Analysis on laser plasma emission for characterization of colloids by video-based computer program
NASA Astrophysics Data System (ADS)
Putri, Kirana Yuniati; Lumbantoruan, Hendra Damos; Isnaeni
2016-02-01
Laser-induced breakdown detection (LIBD) is a sensitive technique for characterization of colloids of small size and low concentration. There are two types of detection, optical and acoustic. Optical LIBD employs a CCD camera to capture the plasma emission and uses the information to quantify the colloids. This technique requires sophisticated technology which is often expensive. In order to build a simple, home-made LIBD system, a dedicated computer program based on MATLAB™ for analyzing laser plasma emission was developed. The analysis was conducted by counting the number of plasma emissions (breakdowns) during a certain period of time. The breakdown probability provided information on colloid size and concentration. A validation experiment showed that the computer program performed well in analyzing the plasma emissions. A graphical user interface (GUI) was also developed to make the program more user-friendly.
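The authors' MATLAB program is not shown; this Python sketch illustrates the counting step on a synthetic stack of camera frames, flagging a breakdown whenever a frame's peak intensity exceeds a threshold. The threshold and frame statistics are invented:

```python
import numpy as np

# Hypothetical frame stack: one video frame of the focal volume per laser shot.
rng = np.random.default_rng(0)
frames = rng.normal(10.0, 2.0, size=(500, 64, 64))        # background noise
hits = rng.random(500) < 0.18                              # true breakdown events
frames[hits] += 40.0 + 60.0 * rng.random((hits.sum(), 1, 1))  # plasma flash

# A breakdown is counted when a frame's peak intensity clears the threshold.
threshold = 30.0
breakdowns = (frames.max(axis=(1, 2)) > threshold).sum()
probability = breakdowns / len(frames)
print(f"breakdown probability: {probability:.2f}")  # relates to size/concentration
```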
Leif Mortenson
2015-01-01
Globally, national forest inventories (NFI) require a large work force typically consisting of multiple teams spread across multiple locations in order to successfully capture a given nation's forest resources. This is true of the Forest Inventory and Analysis (FIA) program in the US and in many inventories in developing countries that are supported by USFS...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
... Charles Carbon Capture and Sequestration Project, Lake Charles, LA AGENCY: Department of Energy. ACTION... competitive process under the Industrial Carbon Capture and Sequestration (ICCS) Program. The Lake Charles Carbon Capture and Sequestration Project (Lake Charles CCS Project) would demonstrate: (1) advanced...
Influence of Smartphones and Software on Acoustic Voice Measures
GRILLO, ELIZABETH U.; BROSIOUS, JENNA N.; SORRELL, STACI L.; ANAND, SUPRAJA
2016-01-01
This study assessed the within-subject variability of voice measures captured using different recording devices (i.e., smartphones and a head-mounted microphone) and software programs (i.e., Analysis of Dysphonia in Speech and Voice (ADSV), Multi-dimensional Voice Program (MDVP), and Praat). Correlations between the software programs that calculated the voice measures were also analyzed. Results demonstrated no significant within-subject variability across devices and software and that some of the measures were highly correlated across software programs. The study suggests that certain smartphones may be appropriate to record daily voice measures representing the effects of vocal loading within individuals. In addition, even though different algorithms are used to compute voice measures across software programs, some of the programs and measures share a similar relationship. PMID:28775797
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option are properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.
Architecture of the software for LAMOST fiber positioning subsystem
NASA Astrophysics Data System (ADS)
Peng, Xiaobo; Xing, Xiaozheng; Hu, Hongzhuan; Zhai, Chao; Li, Weimin
2004-09-01
The architecture of the software which controls the LAMOST fiber positioning sub-system is described. The software is composed of two parts: a main control program on a computer and a unit-controller program in the ROM of an MCS51 single-chip microcomputer. The functions of the software include: Client/Server model establishment, observation planning, collision handling, data transmission, pulse generation, CCD control, image capture and processing, and data analysis. Particular attention is paid to the ways in which different parts of the software communicate. Software techniques for multithreading, SOCKET programming, Microsoft Windows message response, and serial communications are also discussed.
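To make the Client/Server communication concrete, here is a minimal Python socket sketch of a controller sending one command to a unit and receiving an acknowledgement. The MOVE command format and port are invented, and the real system runs the unit side on an MCS51 rather than in Python:

```python
import socket, threading, time

def unit_controller(port=5050):
    """Stand-in for a fiber positioner unit: accept one command, acknowledge."""
    with socket.create_server(("127.0.0.1", port)) as srv:
        conn, _ = srv.accept()
        with conn:
            cmd = conn.recv(64).decode()
            conn.sendall(f"ACK {cmd}".encode())  # e.g. after generating pulses

threading.Thread(target=unit_controller, daemon=True).start()
time.sleep(0.2)  # give the unit side a moment to start listening

# Main control program side: send a hypothetical move command to one unit.
with socket.create_connection(("127.0.0.1", 5050)) as s:
    s.sendall(b"MOVE fiber=42 steps=120")
    print(s.recv(64).decode())
```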
PACE and the Medicare+Choice risk-adjusted payment model.
Temkin-Greener, H; Meiners, M R; Gruenberg, L
2001-01-01
This paper investigates the impact of the Medicare principal inpatient diagnostic cost group (PIP-DCG) payment model on the Program of All-Inclusive Care for the Elderly (PACE). Currently, more than 6,000 Medicare beneficiaries who are nursing home certifiable receive care from PACE, a program poised for expansion under the Balanced Budget Act of 1997. Overall, our analysis suggests that the application of the PIP-DCG model to the PACE program would reduce Medicare payments to PACE, on average, by 38%. The PIP-DCG payment model bases its risk adjustment on inpatient diagnoses and does not capture adequately the risk of caring for a population with functional impairments.
Microbial community analysis using MEGAN.
Huson, Daniel H; Weber, Nico
2013-01-01
Metagenomics, the study of microbes in the environment using DNA sequencing, depends upon dedicated software tools for processing and analyzing very large sequencing datasets. One such tool is MEGAN (MEtaGenome ANalyzer), which can be used to interactively analyze and compare metagenomic and metatranscriptomic data, both taxonomically and functionally. To perform a taxonomic analysis, the program places the reads onto the NCBI taxonomy, while functional analysis is performed by mapping reads to the SEED, COG, and KEGG classifications. Samples can be compared taxonomically and functionally, using a wide range of different charting and visualization techniques. PCoA analysis and clustering methods allow high-level comparison of large numbers of samples. Different attributes of the samples can be captured and used within analysis. The program supports various input formats for loading data and can export analysis results in different text-based and graphical formats. The program is designed to work with very large samples containing many millions of reads. It is written in Java and installers for the three major computer operating systems are available from http://www-ab.informatik.uni-tuebingen.de. © 2013 Elsevier Inc. All rights reserved.
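MEGAN's placement of reads onto the taxonomy is based on its well-known lowest-common-ancestor (LCA) rule: a read is assigned to the lowest taxon that is an ancestor of all taxa its significant hits map to. Below is a self-contained Python sketch of that rule on a toy taxonomy; the tree and names are made up and this is not MEGAN's implementation:

```python
# Toy taxonomy as child -> parent links (a tiny stand-in for NCBI's tree).
parent = {"E.coli": "Escherichia", "Escherichia": "Enterobacteriaceae",
          "S.enterica": "Salmonella", "Salmonella": "Enterobacteriaceae",
          "Enterobacteriaceae": "Bacteria", "Bacteria": "root"}

def path_to_root(taxon):
    path = [taxon]
    while path[-1] in parent:
        path.append(parent[path[-1]])
    return path

def lca_assign(hit_taxa):
    """Place a read on the lowest taxon ancestral to all of its hits."""
    common = set.intersection(*(set(path_to_root(t)) for t in hit_taxa))
    # The LCA is the first common node on any hit's root path.
    return next(t for t in path_to_root(hit_taxa[0]) if t in common)

print(lca_assign(["E.coli"]))                # E.coli (species-specific read)
print(lca_assign(["E.coli", "S.enterica"]))  # Enterobacteriaceae (conserved read)
```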
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
McElroy, Lisa M; Khorzad, Rebeca; Rowe, Theresa A; Abecassis, Zachary A; Apley, Daniel W; Barnard, Cynthia; Holl, Jane L
The purpose of this study was to use fault tree analysis to evaluate the adequacy of quality reporting programs in identifying root causes of postoperative bloodstream infection (BSI). A systematic review of the literature was used to construct a fault tree to evaluate 3 postoperative BSI reporting programs: National Surgical Quality Improvement Program (NSQIP), Centers for Medicare and Medicaid Services (CMS), and The Joint Commission (JC). The literature review revealed 699 eligible publications, 90 of which were used to create the fault tree containing 105 faults. A total of 14 identified faults are currently mandated for reporting to NSQIP, 5 to CMS, and 3 to JC; 2 or more programs require 4 identified faults. The fault tree identifies numerous contributing faults to postoperative BSI and reveals substantial variation in the requirements and ability of national quality data reporting programs to capture these potential faults. Efforts to prevent postoperative BSI require more comprehensive data collection to identify the root causes and develop high-reliability improvement strategies.
Integrated operations/payloads/fleet analysis. Volume 2: Payloads
NASA Technical Reports Server (NTRS)
1971-01-01
The payloads for NASA and non-NASA missions of the integrated fleet are analyzed to generate payload data for the capture and cost analyses for the period 1979 to 1990. Most of the effort is on earth satellites, probes, and planetary missions because of the space shuttle's ability to retrieve payloads for repair, overhaul, and maintenance. Four types of payloads are considered: current expendable payload; current reusable payload; low cost expendable payload, (satellite to be used with expendable launch vehicles); and low cost reusable payload (satellite to be used with the space shuttle/space tug system). Payload weight analysis, structural sizing analysis, and the influence of mean mission duration on program cost are also discussed. The payload data were computerized, and printouts of the data for payloads for each program or mission are included.
A Categorization of Dynamic Analyzers
NASA Technical Reports Server (NTRS)
Lujan, Michelle R.
1997-01-01
Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) metrics; 2) models; and 3) monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input/output data.
AACSD: An atomistic analyzer for crystal structure and defects
NASA Astrophysics Data System (ADS)
Liu, Z. R.; Zhang, R. F.
2018-01-01
We have developed an efficient command-line program named AACSD (Atomistic Analyzer for Crystal Structure and Defects) for the post-analysis of atomic configurations generated by various atomistic simulation codes. The program implements not only the traditional filter methods such as the excess potential energy (EPE), the centrosymmetry parameter (CSP), the common neighbor analysis (CNA), the common neighborhood parameter (CNP), the bond angle analysis (BAA), and the neighbor distance analysis (NDA), but also newly developed ones including the modified centrosymmetry parameter (m-CSP), the orientation imaging map (OIM), and the local crystallographic orientation (LCO). The newly proposed OIM and LCO methods have been extended to all three crystal structures: face-centered cubic, body-centered cubic, and hexagonal close-packed. More specifically, AACSD can easily be used for the atomistic analysis of metallic nanocomposites, with each phase analyzed independently, which provides a unique pathway to capture the dynamic evolution of various defects on the fly. In this paper, we provide not only a thorough overview of the various theoretical methods and their implementation in the AACSD program, but also critical evaluations, specific tests, and applications demonstrating the capability of the program for each functionality.
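As one example of these filters, here is a minimal Python sketch of a centrosymmetry-parameter calculation in the spirit of Kelchner et al.: for each atom, nearly opposite neighbor vectors are paired greedily and |R_i + R_j|² summed, giving zero for a perfectly centrosymmetric environment. This is an illustration, not AACSD's implementation:

```python
import numpy as np
from itertools import combinations

def centrosymmetry(neighbor_vectors: np.ndarray) -> float:
    """CSP from the N nearest-neighbor vectors of one atom: pair up nearly
    opposite neighbors and sum |R_i + R_j|^2 (0 for a perfect lattice site,
    large near defects such as stacking faults or surfaces)."""
    n = len(neighbor_vectors)
    scores = sorted(
        (np.sum((neighbor_vectors[i] + neighbor_vectors[j]) ** 2), i, j)
        for i, j in combinations(range(n), 2))
    used, csp, pairs = set(), 0.0, 0
    for s, i, j in scores:               # greedy matching of opposite pairs
        if i not in used and j not in used:
            used.update((i, j))
            csp += s
            pairs += 1
        if pairs == n // 2:
            break
    return csp

# Perfect simple-cubic-like environment: six axis neighbors -> CSP == 0.
nbrs = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                 [0, -1, 0], [0, 0, 1], [0, 0, -1]], float)
print(centrosymmetry(nbrs))               # 0.0
print(centrosymmetry(nbrs + [0.1, 0, 0])) # > 0 once the environment distorts
```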
75 FR 66420 - ITS Joint Program Office; IntelliDriveSM
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-28
... issues; details of the IntelliDrive program's open data environment and its open source mobility... host a free two-day public workshop to discuss the IntelliDrive\\SM\\ Real-Time Data Capture and... communicate with stakeholders interested in the data capture and dynamic mobility components of the Intelli...
Onsite and Electric Backup Capabilities at Critical Infrastructure Facilities in the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, Julia A.; Wallace, Kelly E.; Kudo, Terence Y.
2016-04-01
The following analysis, conducted by Argonne National Laboratory's (Argonne's) Risk and Infrastructure Science Center (RISC), details electric power backup capabilities of national critical infrastructure as captured through the Department of Homeland Security's (DHS's) Enhanced Critical Infrastructure Program (ECIP) Initiative. Between January 1, 2011, and September 2014, 3,174 ECIP facility surveys were conducted. This study focused first on backup capabilities by infrastructure type and then expanded to infrastructure type by census region.
Field, Christopher R.; Lubrano, Adam; Woytowitz, Morgan; Giordano, Braden C.; Rose-Pehrsson, Susan L.
2014-01-01
The direct liquid deposition of solution standards onto sorbent-filled thermal desorption tubes is used for the quantitative analysis of trace explosive vapor samples. The direct liquid deposition method yields a higher fidelity between the analysis of vapor samples and the analysis of solution standards than using separate injection methods for vapors and solutions, i.e., samples collected on vapor collection tubes and standards prepared in solution vials. Additionally, the method can account for instrumentation losses, which makes it ideal for minimizing variability and quantitative trace chemical detection. Gas chromatography with an electron capture detector is an instrumentation configuration sensitive to nitro-energetics, such as TNT and RDX, due to their relatively high electron affinity. However, vapor quantitation of these compounds is difficult without viable vapor standards. Thus, we eliminate the requirement for vapor standards by combining the sensitivity of the instrumentation with a direct liquid deposition protocol to analyze trace explosive vapor samples. PMID:25145416
Knippenberg, Els; Verbrugghe, Jonas; Lamers, Ilse; Palmaers, Steven; Timmermans, Annick; Spooren, Annemie
2017-06-24
Client-centred task-oriented training is important in neurological rehabilitation but is time consuming and costly in clinical practice. The use of technology, especially motion capture systems (MCS), which are low cost and easy to apply in clinical practice, may support this kind of training, but knowledge and evidence of their use for training are scarce. The present review aims to investigate 1) which motion capture systems are used as training devices in neurological rehabilitation, 2) how they are applied, 3) in which target populations, 4) what the content of the training is, and 5) how efficacious training with MCS is. A computerised systematic literature review was conducted in four databases (PubMed, Cinahl, Cochrane Database and IEEE). The following MeSH terms and keywords were used: Motion, Movement, Detection, Capture, Kinect, Rehabilitation, Nervous System Diseases, Multiple Sclerosis, Stroke, Spinal Cord, Parkinson Disease, Cerebral Palsy and Traumatic Brain Injury. Van Tulder's quality assessment was used to score the methodological quality of the selected studies. The descriptive analysis is reported by MCS, target population, training parameters and training efficacy. Eighteen studies were selected (mean Van Tulder score = 8.06 ± 3.67). Based on methodological quality, six studies were selected for analysis of training efficacy. The most commonly used MCS was the Microsoft Kinect; training was mostly conducted in upper limb stroke rehabilitation. Training programs varied in intensity, frequency and content. None of the studies reported an individualised training program based on a client-centred approach. Motion capture systems are training devices with potential in neurological rehabilitation to increase motivation during training and may assist improvement on one or more International Classification of Functioning, Disability and Health (ICF) levels. Although client-centred task-oriented training is important in neurological rehabilitation, the client-centred approach was not included. Future technological developments should take up the challenge of combining MCS with the principles of a client-centred task-oriented approach and prove efficacy using randomised controlled trials with long-term follow-up. Prospero registration number: 42016035582.
Parallel Wavefront Analysis for a 4D Interferometer
NASA Technical Reports Server (NTRS)
Rao, Shanti R.
2011-01-01
This software provides a programming interface for automating data collection with a PhaseCam interferometer from 4D Technology, and for distributing the image-processing algorithm across a cluster of general-purpose computers. Multiple instances of 4Sight (4D Technology's proprietary software) run on a networked cluster of computers. Each connects to a single server (the controller) and waits for instructions. The controller directs the interferometer to capture several images, then assigns each image to a different computer for processing. When the image processing is finished, the server directs one of the computers to collate and combine the processed images, saving the resulting measurement in a file on disk. The available software captures approximately 100 images and analyzes them immediately. This software separates the capture and analysis processes, so that analysis can be done at a different time and faster, by running the algorithm in parallel across several processors. The PhaseCam family of interferometers can measure an optical system in milliseconds, but it takes many seconds to process the data so that it is usable. In characterizing an adaptive optics system, like the next generation of astronomical observatories, thousands of measurements are required, and the processing time quickly becomes excessive. A programming interface distributes data processing for a PhaseCam interferometer across a Windows computing cluster. A scriptable controller program coordinates data acquisition from the interferometer, storage on networked hard disks, and parallel processing. Idle time of the interferometer is minimized. This architecture is implemented in Python and JavaScript, and may be altered to fit a customer's needs.
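The NASA software itself is not listed; below is a generic Python scatter-gather sketch of the same pattern, farming per-frame processing out to worker processes and collating the results. The placeholder per-frame step and output file name are invented:

```python
import numpy as np
from multiprocessing import Pool

def process_frame(frame: np.ndarray) -> np.ndarray:
    """Stand-in for the per-image processing step (assumed workload)."""
    return np.angle(np.fft.fft2(frame))          # placeholder computation

def collate(results):
    """Combine processed frames into one measurement (here: pixelwise mean)."""
    return np.mean(results, axis=0)

if __name__ == "__main__":
    frames = [np.random.rand(256, 256) for _ in range(100)]  # ~100 captures
    with Pool() as pool:             # controller farms frames out to workers
        processed = pool.map(process_frame, frames)
    measurement = collate(processed)
    np.save("measurement.npy", measurement)      # hypothetical output file
```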
Evaluation of trap capture in a geographically closed population of brown treesnakes on Guam
Tyrrell, C.L.; Christy, M.T.; Rodda, G.H.; Yackel Adams, A.A.; Ellingson, A.R.; Savidge, J.A.; Dean-Bradley, K.; Bischof, R.
2009-01-01
1. Open population mark-recapture analysis of unbounded populations accommodates some types of closure violations (e.g. emigration, immigration). In contrast, closed population analysis of such populations readily allows estimation of capture heterogeneity and behavioural response, but requires crucial assumptions about closure (e.g. no permanent emigration) that are suspect and rarely tested empirically. 2. In 2003, we erected a double-sided barrier to prevent movement of snakes in or out of a 5-ha semi-forested study site in northern Guam. This geographically closed population of >100 snakes was monitored using a series of transects for visual searches and a 13 × 13 trapping array, with the aim of marking all snakes within the site. Forty-five marked snakes were also supplemented into the resident population to quantify the efficacy of our sampling methods. We used the program MARK to analyse trap captures (101 occasions), referenced to census data from visual surveys, and quantified heterogeneity, behavioural response, and size bias in trappability. Analytical inclusion of untrapped individuals greatly improved precision in the estimation of some covariate effects. 3. A novel discovery was that trap captures for individual snakes consisted of asynchronous bouts of high capture probability lasting about 7 days (ephemeral behavioural effect). There was modest behavioural response (trap happiness) and significant latent (unexplained) heterogeneity, with small influences on capture success of date, gender, residency status (translocated or not), and body condition. 4. Trapping was shown to be an effective tool for eradicating large brown treesnakes Boiga irregularis (>900 mm snout-vent length, SVL). 5. Synthesis and applications. Mark-recapture modelling is commonly used by ecological managers to estimate populations. However, existing models involve making assumptions about either closure violations or response to capture. Physical closure of our population on a landscape scale allowed us to determine the relative importance of covariates influencing capture probability (body size, trappability periods, and latent heterogeneity). This information was used to develop models in which different segments of the population could be assigned different probabilities of capture, and suggests that modelling of open populations should incorporate easily measured, but potentially overlooked, parameters such as body size or condition. © 2008 The Authors.
Space Shuttle Guidance, Navigation, and Rendezvous Knowledge Capture Reports. Revision 1
NASA Technical Reports Server (NTRS)
Goodman, John L.
2011-01-01
This document is a catalog and reader's guide to lessons learned, experience, and technical history reports, as well as compilation volumes prepared by United Space Alliance personnel for the NASA/Johnson Space Center (JSC) Flight Dynamics Division. It is intended to make it easier for future generations of engineers to locate knowledge capture documentation from the Shuttle Program. The first chapter covers observations on documentation quality and research challenges encountered during the Space Shuttle and Orion programs. The second chapter covers the knowledge capture approach used to create many of the reports covered in this document. These chapters are intended to provide future flight programs with insight that could be used to formulate knowledge capture and management strategies. The following chapters contain descriptions of each knowledge capture report. The majority of the reports concern the Space Shuttle. Three are included that were written in support of the Orion Program. Most of the reports were written from the years 2001 to 2011. Lessons learned reports concern primarily the shuttle Global Positioning System (GPS) upgrade and the knowledge capture process. Experience reports on navigation and rendezvous provide examples of how challenges were overcome and how best practices were identified and applied. Some reports are of a more technical history nature, covering navigation and rendezvous. They provide an overview of mission activities and the evolution of operations concepts and trajectory design. The lessons learned, experience, and history reports would be considered secondary sources by historians and archivists.
U.S. Spacesuit Knowledge Capture Series Catalog
NASA Technical Reports Server (NTRS)
Bitterly, Rose; Oliva, Vladenka
2012-01-01
The National Aeronautics and Space Administration (NASA) and other organizations have been performing U.S. Spacesuit Knowledge Capture (USSKC) since the beginning of space exploration through published reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. The close physical interaction between spacesuit systems and human beings makes them among the most personally evocative pieces of space hardware. Consequently, spacesuit systems have required nearly constant engineering refinements to do their jobs without impinging on human activity. Since 2008, spacesuit knowledge capture has occurred through video recording, engaging both current and former specialists presenting technical scope specifically to educate individuals and preserve knowledge. These archives of spacesuit legacy reflect its rich history and will provide knowledge that will enhance the chances for the success of future and more ambitious spacesuit system programs. The scope and topics of USSKC have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle Programs; the process of hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; and aspects of program management. USSKC activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS) and the spacesuit community are now working together to provide a comprehensive way to organize and archive intra-agency information related to the development of spacesuit systems. These video recordings are currently being reviewed for public release using NASA export control processes. After a decision is made for either public or non-public release (internal NASA only), the videos and presentations will be available through the NASA Johnson Space Center Engineering Directorate (EA) Engineering Academy, the NASA Technical Reports Server (NTRS), the NASA Aeronautics & Space Database (NA&SD), or NASA YouTube. Event availability is duly noted in this catalog.
Tautin, J.; Lebreton, J.-D.; North, P.M.
1993-01-01
Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge which in turn betters the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize benefits of the methodology, work on practical applications should be increased.
Best Practices for Reliable and Robust Spacecraft Structures
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Murthy, P. L. N.; Patel, Naresh R.; Bonacuse, Peter J.; Elliott, Kenny B.; Gordon, S. A.; Gyekenyesi, J. P.; Daso, E. O.; Aggarwal, P.; Tillman, R. F.
2007-01-01
A study was undertaken to capture the best practices for the development of reliable and robust spacecraft structures for NASA's next generation cargo and crewed launch vehicles. In this study, the NASA heritage programs such as Mercury, Gemini, Apollo, and the Space Shuttle program were examined. A series of lessons learned during the NASA and DoD heritage programs are captured. The processes that "make the right structural system" are examined along with the processes to "make the structural system right". The impact of technology advancements in materials and analysis and testing methods on reliability and robustness of spacecraft structures is studied. The best practices and lessons learned are extracted from these studies. Since the first human space flight, the best practices for reliable and robust spacecraft structures appear to be well established, understood, and articulated by each generation of designers and engineers. However, these best practices apparently have not always been followed. When the best practices are ignored or short cuts are taken, risks accumulate, and reliability suffers. Thus program managers need to be vigilant of circumstances and situations that tend to violate best practices. Adherence to the best practices may help develop spacecraft systems with high reliability and robustness against certain anomalies and unforeseen events.
Eid, Daniel; Guzman-Rivero, Miguel; Rojas, Ernesto; Goicolea, Isabel; Hurtig, Anna-Karin; Illanes, Daniel; San Sebastian, Miguel
2018-01-01
This study evaluates the level of underreporting of the National Program of Leishmaniasis Control (NPLC) in two communities of Cochabamba, Bolivia, during the period 2013-2014. Montenegro skin test-confirmed cases of cutaneous leishmaniasis (CL) were identified through active surveillance during medical campaigns. These cases were compared with those registered in the NPLC through passive surveillance. After matching and cleaning data from the two sources, the total number of cases and the level of underreporting of the National Program were calculated using capture-recapture analysis. The analysis estimated that 86 cases of CL (95% confidence interval [CI]: 62.1-110.8) occurred in the study period in both communities. The level of underreporting of the NPLC in these communities was very high: 73.4% (95% CI: 63.1-81.5%). These results can be explained by the inaccessibility of health services and the centralization of NPLC activities. This information is important for establishing priorities among policy-makers and funding organizations, as well as for implementing adequate intervention plans.
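For two lists, the standard tool is the Chapman-corrected Lincoln-Petersen estimator. Here is a small Python sketch with made-up counts (the paper's actual list sizes are not given in the abstract):

```python
# Two-source capture-recapture (Chapman estimator) with hypothetical counts:
# n1 cases found by active surveillance, n2 registered by the program,
# m found by both sources after record matching.
n1, n2, m = 60, 30, 20

N_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1
var = ((n1 + 1) * (n2 + 1) * (n1 - m) * (n2 - m)
       / ((m + 1) ** 2 * (m + 2)))
ci = (N_hat - 1.96 * var ** 0.5, N_hat + 1.96 * var ** 0.5)

underreporting = 1 - n2 / N_hat   # share of estimated cases the program missed
print(f"N = {N_hat:.1f}, 95% CI = ({ci[0]:.1f}, {ci[1]:.1f}), "
      f"underreporting = {underreporting:.1%}")
```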
Photography Basics. Capturing the Essence of Physical Education and Sport Programs.
ERIC Educational Resources Information Center
Kluka, Darlene A.; Mitchell, Carolyn B.
1990-01-01
The physical educator or coach may be responsible for marketing programs to the public, and skill in 35mm photography can help. Ingredients necessary for successful 35mm movement photography are discussed: knowledge of the movement and the appropriate equipment; techniques for capturing movement; positioning for the ultimate shot; and practice.…
The Automated Instrumentation and Monitoring System (AIMS): Design and Architecture. 3.2
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Schmidt, Melisa; Schulbach, Cathy; Bailey, David (Technical Monitor)
1997-01-01
Whether a researcher is designing the 'next parallel programming paradigm', another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of such information can help computer and software architects to capture, and therefore, exploit behavioral variations among/within various parallel programs to take advantage of specific hardware characteristics. A software tool-set that facilitates performance evaluation of parallel applications on multiprocessors has been put together at NASA Ames Research Center under the sponsorship of NASA's High Performance Computing and Communications Program over the past five years. The Automated Instrumentation and Monitoring System (AIMS) has three major software components: a source code instrumentor which automatically inserts active event recorders into program source code before compilation; a run-time performance monitoring library which collects performance data; and a visualization tool-set which reconstructs program execution based on the data collected. Besides being used as a prototype for developing new techniques for instrumenting, monitoring and presenting parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Currently, the execution of FORTRAN and C programs on the Intel Paragon and PALM workstations can be automatically instrumented and monitored. Performance data thus collected can be displayed graphically on various workstations. The process of performance tuning with AIMS will be illustrated using various NAS Parallel Benchmarks. This report includes a description of the internal architecture of AIMS and a listing of the source code.
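AIMS instruments FORTRAN and C sources before compilation; as a language-agnostic illustration of the underlying idea, here is a Python sketch in which a decorator plays the role of the inserted event recorders, appending timestamped enter/exit events to a trace buffer. The names and trace format are invented:

```python
import time, functools

TRACE = []   # in a real monitor this buffer would be flushed to a trace file

def instrument(fn):
    """Record timestamped entry/exit events, mimicking the active event
    recorders an instrumentor would insert before compilation."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        TRACE.append(("enter", fn.__name__, time.perf_counter()))
        try:
            return fn(*args, **kwargs)
        finally:
            TRACE.append(("exit", fn.__name__, time.perf_counter()))
    return wrapper

@instrument
def solve_block(n):
    return sum(i * i for i in range(n))

solve_block(100_000)
for event, name, stamp in TRACE:        # reconstruct the execution post hoc
    print(f"{stamp:.6f} {event:>5} {name}")
```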
2013-01-07
Contingency Operations Task Force, 2011, p. 4) … Figure 25. Original Organizational Makeup for the CASO (After Deputy …). CAP: Civilian Augmentation Program; CAP: Crisis Action Planning; CASO: Contingency Acquisition Support Office; CBP: Capability-Based … its inclusion in joint exercises; identify and assign responsibilities to institutionalize OCS lesson development, analysis, documentation and use.
The Tissue Analysis Core (TAC) within the AIDS and Cancer Virus Program will process, embed, and perform microtomy on fixed tissue samples presented in ethanol. CD4 (DAB) and CD68/CD163 (FastRed) double immunohistochemistry will be performed, allowing…
Sparger system for MMH-helium vents
NASA Technical Reports Server (NTRS)
Rakow, A.
1983-01-01
Based on a calculated vent flow rate and MMH concentration, a TI-59 program was run to determine the total sparger hole area for a given sparger inlet pressure. Hole diameter is determined from a mass-transfer analysis in the holding tank to achieve complete capture of MMH. In addition, based on oxidation kinetics and vapor-pressure data, MMH atmospheric concentrations are determined 2 ft above the holding tank.
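The TI-59 program itself is not listed; as a sketch of the kind of hole-area calculation described, the Python snippet below applies the standard incompressible orifice equation. The equation choice and all parameter values are assumptions for illustration:

```python
from math import pi, sqrt

# Sparger hole sizing from a vent flow rate and inlet pressure, using the
# standard incompressible orifice equation: mdot = Cd * A * sqrt(2*rho*dP).
# SI units throughout; all values are hypothetical.
mdot = 0.05          # vent mass flow rate, kg/s
rho = 1.2            # gas density at sparger conditions, kg/m^3
dp = 2.0e4           # sparger inlet pressure minus tank pressure, Pa
cd = 0.6             # discharge coefficient for a sharp-edged hole

area_total = mdot / (cd * sqrt(2 * rho * dp))    # total hole area, m^2
d_hole = 0.002       # hole diameter set by the mass-transfer requirement, m
n_holes = area_total / (pi * d_hole ** 2 / 4)
print(f"total area = {area_total * 1e4:.2f} cm^2, holes = {n_holes:.0f}")
```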
2012-12-14
Each pair of rollers is designed to capture the shafts mounted to both ends of the tool lid. Additionally, a safety pin can be put in place to…ITRB for the AH-64D. The scope of the program included structural design, materials selection, manufacturing producibility analysis, tooling design…responsible for tooling design and fabrication, fabrication process development and fabrication of spars and test samples; G3, who designed the RTM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, Stephen Allan
2016-01-28
During the astrophysical r-process, multiple neutron captures occur so rapidly on target nuclei that their daughter nuclei generally do not have time to undergo radioactive decay before another neutron is captured. The r-process can be approximately simulated on Earth in certain types of thermonuclear explosions through an analogous process of rapid neutron captures known as the "prompt capture" process. Between 1952 and 1969, the US fielded 23 nuclear tests that were involved (at least partially) with the "prompt capture" process. Of these tests, 15 were at least partially successful. Some of these tests were conducted under the Plowshare Peaceful Nuclear Explosion Program as scientific research experiments. It is now known that the USSR conducted similar nuclear tests from 1966 to 1979. The elements einsteinium and fermium were first discovered by this process. The most successful tests achieved 19 successive neutron captures on the initial target nuclei. A review is given of the US program, the target nuclei used, heavy element yields, the scientific achievements of the program, and how some of the results have been used by the astrophysical community. Finally, some unanswered questions concerning very neutron-rich nuclei that could potentially have been answered with additional nuclear experiments are presented.
Bernal, Jennifer; Lorenzana, Paulina
2002-06-01
Two Likert-type scales for measuring parents' and caretakers' level of satisfaction with the food and nutrition services offered at childcare multi-centers in a peri-urban community in Caracas were developed and validated. An intentional sample of 20 parents and caretakers was interviewed within the naturalistic-constructivist perspective to capture their perceptions of distinct aspects of the food and nutrition components of the program. Categories emerged from the interviews that served to construct the items for two scales that measure the level of satisfaction of parents and caretakers with the food and nutrition aspects of the program. To validate the scales, they were applied to 73 parents and 32 caretakers. Factor and multiple components analysis showed that, overall, the scales explained 61% and 69% of the variation in level of satisfaction of parents and caretakers, respectively. Reliability, measured with Cronbach's alpha coefficient, was 0.74 and 0.77 for the parents' and caretakers' scales, respectively. These results reveal scales that have content validity and good reliability. In addition, the scales detect specific aspects of the food and nutrition service that should be reinforced or modified to make the Child-care Centers program more effective and efficient. External validation of the scales is recommended, since they provide an instrument capable of capturing useful information for monitoring and evaluating the Child-care Centers program nation-wide, from the perspective of program managers and parents of program users.
O'Brien, Michelle F; Lee, Rebecca; Cromie, Ruth; Brown, Martin J
2016-04-01
Swan pipes, duck decoys, cage traps, cannon netting, and roundups are widely used to capture waterfowl in order to monitor populations. These methods are often regulated in countries with national ringing or banding programs and are considered to be safe, and thus justifiable given the benefits to conservation. However, few published studies have addressed how frequently injuries and mortalities occur, or the nature of any injuries. In the present study, rates of mortality and injury during captures with the use of these methods carried out by the Wildfowl & Wetlands Trust as part of conservation programs were assessed. The total rate of injury (including mild dermal abrasions) was 0.42% across all species groups, whereas total mortality was 0.1% across all capture methods. Incidence of injury varied among species groups (ducks, geese, swans, and rails), with some, for example, dabbling ducks, at greater risk than others. We also describe techniques used before, during, and after a capture to reduce stress and injury in captured waterfowl. Projects using these or other capture methods should monitor and publish their performance to allow sharing of experience and to reduce risks further.
Spurgeon, Dale W
2016-04-01
Eradication programs for the boll weevil (Anthonomus grandis grandis Boheman) rely on pheromone-baited traps to trigger insecticide treatments and monitor program progress. A key objective of monitoring in these programs is the timely detection of incipient weevil populations to limit or prevent re-infestation. Therefore, improvements in the effectiveness of trapping would enhance efforts to achieve and maintain eradication. Association of pheromone traps with woodlots and other prominent vegetation is reported to increase captures of weevils, but the spatial scale over which this effect occurs is unknown. The influences of trap distance (0, 10, and 20 m) and orientation (leeward or windward) to brush lines on boll weevil captures were examined during three noncropping seasons (October to February) in the Rio Grande Valley of Texas. Differences in numbers of captured weevils and in the probability of capture between traps at 10 or 20 m from brush, although often statistically significant, were generally small and variable. Variations in boll weevil population levels, wind directions, and wind speeds apparently contributed to this variability. In contrast, traps closely associated with brush (0 m) generally captured larger numbers of weevils and offered a higher probability of weevil capture compared with traps away from brush. These increases in the probability of weevil capture were as high as 30%. Such increases in the ability of traps to detect low-level boll weevil populations indicate that trap placement with respect to prominent vegetation is an important consideration in maximizing the effectiveness of trap-based monitoring for the boll weevil.
NASA Astrophysics Data System (ADS)
Rehmer, Donald E.
Results from a mathematical programming model were examined to 1) determine the least-cost options for infrastructure development of geologic storage of CO2 in the Illinois Basin, and 2) analyze a number of CO2 emission tax and oil price scenarios in order to implement development of the least-cost pipeline networks for distribution of CO2. The model, using mixed integer programming, tested the hypothesis of whether viable EOR sequestration sites can serve as nodal points or hubs to expand the CO2 delivery infrastructure to more distal locations from the emissions sources. This is in contrast to previous model results based on a point-to-point model having direct pipeline segments from each CO2 capture site to each storage sink. There is literature on the spoke-and-hub problem that relates to airline scheduling as well as maritime shipping. A large-scale ship assignment problem that utilized integer linear programming was run on Excel Solver and described by Mourao et al. (2001). Other literature indicates that aircraft assignment in spoke-and-hub routes can also be achieved using integer linear programming (Daskin and Panayotopoulos, 1989; Hane et al., 1995). The distribution concept is basically the reverse of the "tree and branch" type gathering systems (Rothfarb et al., 1970) for oil and natural gas that industry has been developing for decades. Model results indicate that the inclusion of hubs as variables in the model yields lower transportation costs for geologic carbon dioxide storage than previous models of point-to-point infrastructure geometries. Tabular results and GIS maps of the selected scenarios illustrate that EOR sites can serve as nodal points or hubs for distribution of CO2 to distal oil field locations as well as deeper saline reservoirs. Revenue amounts and capture percentages both show an improvement over solutions in which hubs are not allowed to enter the solution. Other results indicate that geologic storage of CO2 in saline aquifers does not enter the solutions selected by the model until the CO2 emissions tax approaches $50/tonne. CO2 capture and storage begins to occur when the oil price is above $24.42 a barrel based on the constraints of the model. The annual storage capacity of the basin is nearly maximized when the net price of oil is as low as $40 per barrel and the CO2 emission tax is $60/tonne. The results from every scenario examined by this study demonstrate that EOR utilizing anthropogenically captured CO2 will earn net revenue, and thus represents an economically viable option for CO2 storage in the Illinois Basin.
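A minimal sketch of the hub-selection idea, with hypothetical sources, sinks, costs, and capacities (this is not the study's model or data, and the choice of the PuLP library is ours): a mixed integer program opens a candidate hub only when routing flow through it beats direct point-to-point pipelines.

```python
# Hub-vs-direct pipeline sketch as a mixed integer program (illustrative data).
import pulp

supply = {"plantA": 5.0, "plantB": 3.0}        # Mt CO2/yr captured at sources
capacity = {"eor1": 4.0, "saline1": 6.0}       # Mt CO2/yr storage at sinks
hub_cost = {"hub1": 12.0}                      # fixed cost of opening the hub
cost = {                                       # per-unit transport cost per arc
    ("plantA", "hub1"): 2.0, ("plantB", "hub1"): 2.5,
    ("hub1", "eor1"): 1.0, ("hub1", "saline1"): 1.5,
    ("plantA", "eor1"): 6.0, ("plantB", "saline1"): 7.0,  # direct pipelines
}

m = pulp.LpProblem("co2_hub_network", pulp.LpMinimize)
flow = {a: pulp.LpVariable(f"f_{a[0]}_{a[1]}", lowBound=0) for a in cost}
open_hub = {h: pulp.LpVariable(f"open_{h}", cat="Binary") for h in hub_cost}

# Objective: pipeline transport costs plus fixed hub-opening costs.
m += (pulp.lpSum(cost[a] * flow[a] for a in cost)
      + pulp.lpSum(hub_cost[h] * open_hub[h] for h in hub_cost))

for s, amt in supply.items():   # all captured CO2 must be shipped somewhere
    m += pulp.lpSum(f for a, f in flow.items() if a[0] == s) == amt
for h in hub_cost:              # hub flow conservation; usable only if opened
    inflow = pulp.lpSum(f for a, f in flow.items() if a[1] == h)
    outflow = pulp.lpSum(f for a, f in flow.items() if a[0] == h)
    m += inflow == outflow
    m += inflow <= sum(supply.values()) * open_hub[h]
for k, cap in capacity.items(): # sink storage capacity
    m += pulp.lpSum(f for a, f in flow.items() if a[1] == k) <= cap

m.solve(pulp.PULP_CBC_CMD(msg=False))
for a, f in flow.items():
    print(a, f.value())
```

With these illustrative costs the solver opens the hub, because the per-unit savings on both source streams outweigh the fixed opening cost; raising the fixed cost enough flips the solution back to direct point-to-point lines.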
MSL Lessons Learned and Knowledge Capture
NASA Technical Reports Server (NTRS)
Buxbaum, Karen L.
2012-01-01
The Mars Program has recently been informed of the Planetary Protection Subcommittee (PPS) recommendation, which was endorsed by the NAC, concerning Mars Science Laboratory (MSL) lessons learned and knowledge capture. The Mars Program has not had an opportunity to consider any decisions specific to the PPS recommendation. Some of the activities recommended by the PPS would involve members of the MSL flight team who are focused on cruise, entry, descent, and landing, and early surface operations; those activities would have to wait. Members of the MSL planetary protection team at JPL are still available to support MSL lessons learned and knowledge capture; some of the specifically recommended activities have already begun. The Mars Program shares the PPS/NAC concerns about the loss of potential information and expertise in planetary protection practice.
Cost estimation model for advanced planetary programs, fourth edition
NASA Technical Reports Server (NTRS)
Spadoni, D. J.
1983-01-01
The development of the planetary program cost model is discussed. The model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost data base. This data base comprises the historical cost data for 13 unmanned lunar and planetary flight programs. The revision was made with a twofold objective: to increase the flexibility of the model in its ability to deal with the broad scope of scenarios under consideration for future missions, and to maintain and possibly improve upon the confidence in the model's capabilities, with an expected accuracy of 20%. The model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the model is discussed and two sample applications of the cost model are presented.
U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Year 2015
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2016-01-01
The NASA U.S. Spacesuit Knowledge Capture (SKC) Program continues to capture, share, and archive significant spacesuit-related knowledge with engineers, other technical staff, and invested entities. Since its 2007 inception, the SKC Program has hosted and recorded more than 75 events. By the end of Fiscal Year (FY) 2015, 40 of these had been processed and uploaded to a publicly accessible NASA Web site where viewers can expand their knowledge about the spacesuit's evolution, known capabilities and limitations, and lessons learned. Sharing this knowledge with entities beyond NASA can not only increase people's understanding of the technical effort and importance involved in designing a spacesuit, it can also expand interest in and support for this valuable program, which ensures that significant knowledge is retained and accessible. This paper discusses the FY 2015 SKC events, the release and accessibility of the approved events, and the program's future plans.
Collaboration Practices: An Analysis Within an Army Acquisition Program Office
2014-03-01
…comprised of both intrinsic and extrinsic incentives and rewards. According to the survey results, this was not a high factor in this stakeholder group… "people back alive." This quote captures the essence of the type of reward and motivation the stakeholder group shares. It is an intrinsic reward… provide information about their general perceptions of inter-organizational collaborative capacity. They were not asked to think about a specific…
Aerocapture Performance Analysis for a Neptune-Triton Exploration Mission
NASA Technical Reports Server (NTRS)
Starr, Brett R.; Westhelle, Carlos H.; Masciarelli, James P.
2004-01-01
A systems analysis has been conducted for a Neptune-Triton Exploration Mission in which aerocapture is used to capture a spacecraft at Neptune. Aerocapture uses aerodynamic drag instead of propulsion to decelerate from the interplanetary approach trajectory to a captured orbit during a single pass through the atmosphere. After capture, propulsion is used to move the spacecraft from the initial captured orbit to the desired science orbit. A preliminary assessment identified that a spacecraft with a lift-to-drag ratio of 0.8 was required for aerocapture. Performance analyses of the 0.8 L/D vehicle were performed using a high-fidelity flight simulation within a Monte Carlo executive to determine mission success statistics. The simulation was the Program to Optimize Simulated Trajectories (POST) modified to include Neptune-specific atmospheric and planet models, spacecraft aerodynamic characteristics, and interplanetary trajectory models. To these were added autonomous guidance and pseudo flight controller models. The Monte Carlo analyses incorporated approach trajectory delivery errors, aerodynamic characteristics uncertainties, and atmospheric density variations. Monte Carlo analyses were performed for a reference set of uncertainties and for sets of uncertainties modified to produce increased and reduced atmospheric variability. For the reference uncertainties, the 0.8 L/D flatbottom ellipsled vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 360 m/s delta-V budget for apoapsis and periapsis adjustment. Monte Carlo analyses were also performed for a guidance system that modulates both bank angle and angle of attack with the reference set of uncertainties. An alpha and bank modulation guidance system reduces the 99.87 percentile delta-V by 173 m/s (48%), to 187 m/s, for the reference set of uncertainties.
Economic impacts of Medicaid in North Carolina.
Dumas, Christopher; Hall, William; Garrett, Patricia
2008-01-01
The purpose of this study is to provide estimates of the economic impacts of Medicaid program expenditures in North Carolina in state fiscal year (SFY) 2003. The study uses input-output analysis to estimate the economic impacts of Medicaid expenditures. The study uses North Carolina Medicaid program expenditure data for SFY 2003 as submitted by the North Carolina Division of Medical Assistance to the federal Centers for Medicare and Medicaid Services (CMS). Industry structure data from 2002 that are part of the IMPLAN input-output modeling software database are also used in the analysis. In SFY 2003, $6.307 billion in Medicaid program expenditures occurred within the state of North Carolina: $3.941 billion in federal dollars, $2.014 billion in state dollars, and $351 million in local government funds. Each dollar of state and local government expenditures brought $1.67 in federal Medicaid cost-share to the state. The economic impacts within North Carolina of the 2003 Medicaid expenditures included the following: 182,000 jobs supported (including both full-time and some part-time jobs); $6.1 billion in labor income (wages, salaries, sole proprietorship/partnership profits); and $1.9 billion in capital income (rents, interest payments, corporate dividend payments). If the Medicaid program were shut down and the funds returned to taxpayers who saved/spent the funds according to typical consumer expenditure patterns, employment in North Carolina would fall by an estimated 67,400 jobs, and labor income would fall by $2.83 billion, due to the labor-intensive nature of Medicaid expenditures. Medicaid expenditure and economic impact results do not capture the economic value of the improved health and well-being of Medicaid recipients. Furthermore, the results do not capture the savings to society from increased preventive care and reduced uncompensated care resulting from Medicaid. State and local government expenditures do not fully capture the economic consequences of Medicaid in North Carolina. This study finds that Medicaid makes a large contribution to state and local economic activity by creating jobs, income, and profit in North Carolina. Any changes to the Medicaid program should be made with caution. The rising costs of health care and the appropriate role of government health insurance programs are the object of current policy debates. Informed discussion of these issues requires good information on the economic and health consequences of alternative policy choices. This is the first systematic study of the broader economic impacts of Medicaid expenditures in North Carolina.
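The $1.67 leverage figure quoted above follows directly from the reported expenditure split; a back-of-envelope check (not part of the study's input-output model) is:

```python
# Quick arithmetic check of the federal leverage ratio, using the SFY 2003
# expenditure figures reported in the study (billions of dollars).
federal = 3.941
state = 2.014
local = 0.351

match_per_state_local_dollar = federal / (state + local)
total = federal + state + local  # ~6.306; study reports 6.307 after rounding
print(f"Federal dollars per state/local dollar: {match_per_state_local_dollar:.2f}")
print(f"Total in-state Medicaid expenditures:   {total:.3f} billion")
```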
The elements of design knowledge capture
NASA Technical Reports Server (NTRS)
Freeman, Michael S.
1988-01-01
This paper will present the basic constituents of a design knowledge capture effort. This will include a discussion of the types of knowledge to be captured in such an effort and the difference between design knowledge capture and more traditional knowledge base construction. These differences include both knowledge base structure and knowledge acquisition approach. The motivation for establishing a design knowledge capture effort as an integral part of major NASA programs will be outlined, along with the current NASA position on that subject. Finally the approach taken in design knowledge capture for Space Station will be contrasted with that used in the HSTDEK project.
ERIC Educational Resources Information Center
Vassar, Penny; Havice, Pamela A.; Havice, William L.; Brookover, Robert, IV
2015-01-01
Lecture capture technology allows instructors to record presentations and make them available to their students digitally. This study examined one program's implementation of lecture capture. Participants were undergraduate college students enrolled in Parks, Recreation, and Tourism Management courses at a public land grant university in the…
A Learning Theory Conceptual Foundation for Using Capture Technology in Teaching
ERIC Educational Resources Information Center
Berardi, Victor; Blundell, Greg
2014-01-01
Lecture capture technologies are increasingly being used by instructors, programs, and institutions to deliver online lectures and courses. This lecture capture movement is important as it increases access to education opportunities that were not possible before, it can improve efficiency, and it can increase student engagement. However, this is…
Program theory-driven evaluation science in a youth development context.
Deane, Kelsey L; Harré, Niki
2014-08-01
Program theory-driven evaluation science (PTDES) provides a useful framework for uncovering the mechanisms responsible for positive change resulting from participation in youth development (YD) programs. Yet it is difficult to find examples of PTDES that capture the complexity of such experiences. This article offers a much-needed example of PTDES applied to Project K, a youth development program with adventure, service-learning and mentoring components. Findings from eight program staff focus groups, 351 youth participants' comments, four key program documents, and results from six previous Project K research projects were integrated to produce a theory of change for the program. A direct logic analysis was then conducted to assess the plausibility of the proposed theory against relevant research literature. This demonstrated that Project K incorporates many of the best practice principles discussed in the literature that covers the three components of the program. The contributions of this theory-building process to organizational learning and development are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
MPCV Exercise Operational Volume Analysis
NASA Technical Reports Server (NTRS)
Godfrey, A.; Humphreys, B.; Funk, J.; Perusek, G.; Lewandowski, B. E.
2017-01-01
In order to minimize the loss of bone and muscle mass during spaceflight, the Multi-Purpose Crew Vehicle (MPCV) will include an exercise device and enough free space within the cabin for astronauts to use the device effectively. The NASA Digital Astronaut Project (DAP) has been tasked with using computational modeling to aid in determining whether or not the available operational volume is sufficient for in-flight exercise. Motion capture data were acquired using a 12-camera Smart DX system (BTS Bioengineering, Brooklyn, NY) while exercisers performed 9 resistive exercises without volume restrictions in a 1g environment. Data were collected from two male subjects, one in the 99th percentile of height and the other in the 50th percentile of height, using between 25 and 60 motion capture markers. Motion capture data were also recorded as a third subject, also near the 50th percentile in height, performed aerobic rowing during a parabolic flight. A motion capture system and algorithms developed previously and presented at last year's HRP-IWS were utilized to collect and process the data from the parabolic flight [1]. These motions were applied to a scaled version of a biomechanical model within the biomechanical modeling software OpenSim [2], and the volume sweeps of the motions were visually assessed against an imported CAD model of the operational volume. Further numerical analysis was performed using MATLAB (MathWorks, Natick, MA) and the OpenSim API. This analysis determined the location of every marker in space over the duration of the exercise motion and the distance of each marker to the nearest surface of the volume. Containment of the exercise motions within the operational volume was determined on a per-exercise and per-subject basis. The orientation of the exerciser and the angle of the footplate were two important factors upon which containment was dependent. Regions where the exercise motion exceeds the bounds of the operational volume have been identified by determining which markers from the motion capture exceed the operational volume and by how much. A credibility assessment of this analysis was performed in accordance with NASA-STD-7009 prior to delivery to the MPCV program.
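The marker-to-volume distance computation described above can be sketched as follows, using an axis-aligned box as a stand-in for the actual MPCV operational volume CAD model and synthetic marker trajectories in place of the recorded motion capture data (both are assumptions for illustration):

```python
# Containment check sketch: how far does each marker stray outside a box?
import numpy as np

rng = np.random.default_rng(0)
# markers: (n_frames, n_markers, 3) positions in meters; synthetic stand-in data
markers = rng.normal(loc=0.0, scale=0.6, size=(500, 25, 3))
box_min = np.array([-1.0, -1.0, -1.0])   # hypothetical operational volume
box_max = np.array([1.0, 1.0, 1.0])

# Per-axis excess beyond the box is zero inside; the norm of the excess vector
# is the Euclidean distance to the nearest face, edge, or corner.
excess = np.maximum(box_min - markers, 0) + np.maximum(markers - box_max, 0)
dist_outside = np.linalg.norm(excess, axis=-1)   # (n_frames, n_markers)

contained = bool((dist_outside == 0).all())
print(f"motion contained: {contained}, worst excursion: {dist_outside.max():.3f} m")
```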
ERIC Educational Resources Information Center
Revell, Kevin D.
2014-01-01
Three emerging technologies were used in a large introductory chemistry class: a tablet PC, a lecture capture and replay software program, and an online homework program. At the end of the semester, student usage of the lecture replay and online homework systems was compared to course performance as measured by course grade and by a standardized…
Beyond supervised learning: A multi-perspective approach to outpatient physical therapy mentoring.
Buning, Megan M; Buning, Shaun W
2018-02-23
Novice physical therapists face multiple challenges as they transition to autonomous, efficient, and seasoned therapists. Mentoring is known to facilitate growth among novice therapists; however, formalized mentoring programs within the outpatient setting are scarce or management-centered. This study sought to explore the most desired components of a formal mentoring program from multiple perspectives. An inductive qualitative inquiry explored the perceptions of participants (n = 35) from four populations. Interviews were conducted with students (n = 5) and novice therapists (n = 5), and survey data were collected from faculty (n = 7) and expert therapists (n = 18). Thematic content analysis was used for data analysis. Three primary themes emerged as program emphases: 1) program function; 2) novice therapists' needs; and 3) the making of a mentorship (including mentor/mentee characteristics and matching strategy). This study captured multiple perspectives on the components of interest in the development of a formalized mentoring program for novice therapists in the outpatient setting. As the profession continues to emphasize standards for guided learning, steps must be taken by individual employers to promote and facilitate the most effective practices. Findings provide depth and suggestions for developing an outpatient mentoring program.
Runtime Verification of C Programs
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2008-01-01
We present in this paper a framework, RMOR, for monitoring the execution of C programs against state machines, expressed in a textual (nongraphical) format in files separate from the program. The state machine language has been inspired by a graphical state machine language, RCAT, recently developed at the Jet Propulsion Laboratory as an alternative to using Linear Temporal Logic (LTL) for requirements capture. Transitions between states are labeled with abstract event names and Boolean expressions over such events. The abstract events are connected to code fragments using an aspect-oriented pointcut language similar to ASPECTJ's or ASPECTC's pointcut language. The system is implemented in the C analysis and transformation package CIL, and is programmed in OCAML, the implementation language of CIL. The work is closely related to the notion of stateful aspects within aspect-oriented programming, where pointcut languages are extended with temporal assertions over the execution trace.
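The core runtime-monitoring idea can be sketched in a few lines (this does not reproduce RMOR's actual state machine syntax or its C instrumentation; the property and event names are invented for illustration): a monitor consumes the abstract events emitted by the instrumented program and flags any trace that violates the stated property.

```python
# Minimal runtime monitor sketch: a state machine checks the property
# "every 'open' is followed by 'close' before 'exit'" over an event trace.
class Monitor:
    def __init__(self):
        self.state = "closed"
        self.violated = False

    def emit(self, event):
        if self.state == "closed" and event == "open":
            self.state = "open"
        elif self.state == "open" and event == "close":
            self.state = "closed"
        elif event == "exit" and self.state == "open":
            self.violated = True  # resource still open at program exit

monitor = Monitor()
for event in ["open", "close", "open", "exit"]:  # trace from instrumented code
    monitor.emit(event)
print("property violated:", monitor.violated)    # True: second open never closed
```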
Fasciola hepatica in goats from north-western Spain: Risk factor analysis using a capture ELISA.
Pérez-Creo, Ana; Díaz, Pablo; López, Ceferino; Béjar, Juan Pablo; Martínez-Sernández, Victoria; Panadero, Rosario; Díez-Baños, Pablo; Ubeira, Florencio M; Morrondo, Patrocinio
2016-02-01
In order to study the seroprevalence of Fasciola hepatica infection in goats from north-western Spain, a total of 603 serum samples from 47 herds were tested using a capture ELISA (MM3-SERO). The identification of risk factors was assessed by a mixed-effects logistic regression analysis. The results showed that F. hepatica is widespread in this area with 57.4% of the herds and 22.7% of the animals testing positive. Breed and age were identified as determining factors for caprine F. hepatica infection. Seroprevalence in cross-bred animals was significantly higher than in the autochthonous Cabra Galega breed. A significantly higher seroprevalence was observed in older animals. The use of locally adapted breeds and the implementation of suitable management practices could provide a substantial improvement over the current F. hepatica control measures carried out in goat herds and should be considered when designing new F. hepatica control programs. Copyright © 2015 Elsevier Ltd. All rights reserved.
Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.
Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian
2015-12-16
Quantitatively capturing developmental processes is crucial to derive mechanistic models and key to identify and describe mutant phenotypes. Here protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy and methods for tracking and quantification of developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently-labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, tracking of vesicle flow.
NEW NEUTRON-CAPTURE MEASUREMENTS IN 23 OPEN CLUSTERS. I. THE r-PROCESS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Overbeek, Jamie C.; Friel, Eileen D.; Jacobson, Heather R., E-mail: joverbee@indiana.edu
2016-06-20
Neutron-capture elements, those with Z > 35, are the least well understood in terms of nucleosynthesis and formation environments. The rapid neutron-capture, or r-process, elements are formed in the environments and/or remnants of massive stars, while the slow neutron-capture, or s-process, elements are primarily formed in low-mass AGB stars. These elements can provide much information about Galactic star formation and enrichment, but observational data are limited. We have assembled a sample of 68 stars in 23 open clusters that we use to probe abundance trends for six neutron-capture elements (Eu, Gd, Dy, Mo, Pr, and Nd) with cluster age and location in the disk of the Galaxy. In order to keep our analysis as homogeneous as possible, we use an automated synthesis fitting program, which also enables us to measure multiple (3–10) lines for each element. We find that the pure r-process elements (Eu, Gd, and Dy) have positive trends with increasing cluster age, while the mixed r- and s-process elements (Mo, Pr, and Nd) have insignificant trends consistent with zero. Pr, Nd, Eu, Gd, and Dy have similar, slight (although mostly statistically significant) gradients of ∼0.04 dex kpc⁻¹. The mixed elements also appear to have nonlinear relationships with R_GC.
Peterfreund, Robert A; Driscoll, William D; Walsh, John L; Subramanian, Aparna; Anupama, Shaji; Weaver, Melissa; Morris, Theresa; Arnholz, Sarah; Zheng, Hui; Pierce, Eric T; Spring, Stephen F
2011-05-01
Efforts to assure high-quality, safe, clinical care depend upon capturing information about near-miss and adverse outcome events. Inconsistent or unreliable information capture, especially for infrequent events, compromises attempts to analyze events in quantitative terms, understand their implications, and assess corrective efforts. To enhance reporting, we developed a secure, electronic, mandatory system for reporting quality assurance data linked to our electronic anesthesia record. We used the capabilities of our anesthesia information management system (AIMS) in conjunction with internally developed, secure, intranet-based, Web application software. The application is implemented with a backend allowing robust data storage, retrieval, data analysis, and reporting capabilities. We customized a feature within the AIMS software to create a hard stop in the documentation workflow before the end of anesthesia care time stamp for every case. The software forces the anesthesia provider to access the separate quality assurance data collection program, which provides a checklist for targeted clinical events and a free text option. After completing the event collection program, the software automatically returns the clinician to the AIMS to finalize the anesthesia record. The number of events captured by the departmental quality assurance office increased by 92% (95% confidence interval [CI] 60.4%-130%) after system implementation. The major contributor to this increase was the new electronic system. This increase has been sustained over the initial 12 full months after implementation. Under our reporting criteria, the overall rate of clinical events reported by any method was 471 events out of 55,382 cases or 0.85% (95% CI 0.78% to 0.93%). The new system collected 67% of these events (95% confidence interval 63%-71%). We demonstrate the implementation in an academic anesthesia department of a secure clinical event reporting system linked to an AIMS. The system enforces entry of quality assurance information (either no clinical event or notification of a clinical event). System implementation resulted in capturing nearly twice the number of events at a relatively steady case load. © 2011 International Anesthesia Research Society
A farm-level precision land management framework based on integer programming
Li, Qi; Hu, Guiping; Jubery, Talukder Zaki; Ganapathysubramanian, Baskar
2017-01-01
Farmland management involves several planning and decision making tasks, including seed selection and irrigation management. A farm-level precision farmland management model based on mixed integer linear programming is proposed in this study. Optimal decisions are designed for pre-season planning of crops and irrigation water allocation. The model captures the effect of the size and shape of the decision scale as well as special irrigation patterns. The authors illustrate the model with a case study of a farm in the state of California in the U.S. and show that the model can capture the impact of precision farm management on profitability. The results show that a threefold increase in annual net profit for farmers could be achieved by carefully choosing irrigation and seed selection. Although farmers could increase profits by applying precision management to seed or irrigation alone, the profit increase is more significant if farmers apply precision management to seed and irrigation simultaneously. The proposed model can also serve as a risk analysis tool for farmers facing seasonal irrigation water limits as well as a quantitative tool to explore the impact of precision agriculture. PMID:28346499
Improving The Near-Earth Meteoroid And Orbital Debris Environment Definition With LAD-C
NASA Technical Reports Server (NTRS)
Liou, J.-C.; Giovane, F. J.; Corsaro, R. C.; Burchell, M. J.; Drolshagen, G.; Kawai, H.; Tabata, M.; Stansbery, E. G.; Westphal, A. J.; Yano, H.
2006-01-01
To improve the near-Earth meteoroid and orbital debris environment definition, a large area particle sensor/collector is being developed to be placed on the International Space Station (ISS). This instrument, the Large Area Debris Collector (LAD-C), will attempt to record meteoroid and orbital debris impact flux, and capture the same particles with aerogel. After at least one year of deployment, the whole system will be brought back for additional laboratory analysis of the captured meteoroids and orbital debris. This project is led by the U.S. Naval Research Laboratory (NRL) while the U.S. Department of Defense (DoD) Space Test Program (STP) is responsible for the integration, deployment, and retrieval of the system. Additional contributing team members of the consortium include the NASA Orbital Debris Program Office, JAXA Institute of Space and Astronautical Science (ISAS), Chiba University (Japan), ESA Space Debris Office, University of Kent (UK), and University of California at Berkeley. The deployment of LAD-C on the ISS is planned for 2008, with the system retrieval in late 2009.
INEL BNCT Research Program Annual Report 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venhuizen, J.R.
1994-08-01
This report is a summary of the progress and research produced for the Idaho National Engineering Laboratory Boron Neutron Capture Therapy Research Program for calendar year 1993. Contributions from all the principal investigators are included, covering chemistry (pituitary tumor studies, boron drug development including liposomes, lipoproteins, and carboranylalanine derivatives), pharmacology (murine screenings, toxicity testing, boron drug analysis), physics (radiation dosimetry software, neutron beam and filter design, neutron beam measurement dosimetry), and radiation biology (tissue and efficacy studies of small and large animal models). Information on the potential toxicity of borocaptate sodium and boronophenylalanine is presented. Results for 21 spontaneous-tumor-bearing dogs that have been treated with boron neutron capture therapy at the Brookhaven National Laboratory are updated. Boron-containing drug purity verification is discussed in some detail. Advances in magnetic resonance imaging of boron in vivo are discussed. Several boron-carrying drugs exhibiting good tumor uptake are described. Significant progress on the potential of treating pituitary tumors is presented. Measurement of the epithermal-neutron flux of the Petten (The Netherlands) High Flux Reactor beam (HFB11B), and comparison to predictions, are shown.
Warren, Carol; Visser, Leontine
The local turn in good governance theory and practice responded to critiques of the ineffectiveness of state management and the inequity of privatization alternatives in natural resource management. However, confounding expectations of greater effectiveness from decentralised governance, including community-based natural resource management, critics argue that expanded opportunities for elite capture have become widely associated with program failures. This overview of theoretical controversies on leadership, patronage, and elite capture is part of a themed section in this issue that challenges assumptions across a wide range of current policy literature. It introduces a set of Indonesian case studies that examine the practices of local leaders and elites and seek to account in structural terms for appropriations both by ('elite capture') and of ('captured elites') these key figures. These studies explore the structural factors and co-governance practices most likely to promote effective participation of the full spectrum of local interests in pursuit of better local natural resource governance.
A Process for Capturing the Art of Systems Engineering
NASA Technical Reports Server (NTRS)
Owens, Clark V., III; Sekeres, Carrie; Roumie, Yasmeen
2016-01-01
There is both an art and a science to systems engineering. The science of systems engineering is effectively captured in processes and procedures, but the art is much more elusive. We propose a six-step process that can be applied to any systems engineering organization to create an environment from which the "art" of that organization can be captured, allowed to evolve collaboratively, and shared with all members of the organization. This paper details this process as it was applied to the NASA Launch Services Program (LSP) Integration Engineering Branch during a pilot program of Confluence, a Commercial Off The Shelf (COTS) wiki tool.
A collapsible trap for capturing ruffe
Edwards, Andrew J.; Czypinski, Gary D.; Selgeby, James H.
1998-01-01
A modified version of the Windermere trap was designed, constructed, and tested for its effectiveness in capturing ruffe Gymnocephalus cernuus. The inexpensive, lightweight, collapsible trap was easily deployed and retrieved from a small boat. Field tests conducted at the St. Louis River estuary in western Lake Superior in spring 1995 and 1996 indicated that the trap was effective in capturing ruffe. Proportions of the ruffe in trap and bottom trawl catches were similar in 1995 and 1996. This trap could be a useful tool in surveillance, monitoring, or control programs for ruffe or similar species, either to augment existing sampling programs or especially in situations where gillnetting or bottom trawling are not feasible.
MSFC Space Station Program Commonly Used Acronyms and Abbreviations Listing
NASA Technical Reports Server (NTRS)
Gates, Thomas G.
1988-01-01
The Marshall Space Flight Center maintains an active history program to assure that the foundation of the Center's history is captured and preserved for current and future generations. As part of that overall effort, the Center began a project in 1987 to capture historical information and documentation on the Marshall Center's roles regarding the Space Shuttle and Space Station. This document is the MSFC Space Station Program Commonly Used Acronyms and Abbreviations Listing. It contains acronyms and abbreviations used in Space Station documentation and in the Historian's Annotated Bibliography of the Space Station Program. The information may be used by the researcher as a reference tool.
Parsley, Michael J.; Kofoot, Eric
2013-01-01
Wild-spawned white sturgeon (Acipenser transmontanus) larvae, captured and reared in aquaculture facilities and subsequently released, are increasingly being used in sturgeon restoration programs in the Columbia River Basin. A reconnaissance study was conducted to determine where to deploy nets to capture white sturgeon larvae downstream of a known white sturgeon spawning area. As a result of the study, 103 white sturgeon larvae and 5 newly hatched free-swimming embryos were captured at 3 of 5 reconnaissance netting sites. The netting, conducted downstream of The Dalles Dam on the Columbia River during June 25–29, 2012, provided information for potentially implementing full-scale collection efforts of large numbers of larvae for rearing in aquaculture facilities and for subsequent release at a larger size in white sturgeon restoration programs.
Design knowledge capture for a corporate memory facility
NASA Technical Reports Server (NTRS)
Boose, John H.; Shema, David B.; Bradshaw, Jeffrey M.
1990-01-01
Currently, much of the information regarding decision alternatives and trade-offs made in the course of a major program development effort is not represented or retained in a way that permits computer-based reasoning over the life cycle of the program. The loss of this information results in problems in tracing design alternatives to requirements, in assessing the impact of changes in requirements, and in configuration management. To address these problems, we studied the problem of building an intelligent, active corporate memory facility that would provide for the capture of the requirements and standards of a program, analyze the design alternatives and trade-offs made over the program's lifetime, and examine relationships between requirements and design trade-offs. Early phases of the work have concentrated on design knowledge capture for Space Station Freedom. A tool that helps automate and document engineering trade studies is being demonstrated and extended, and another tool is being developed to help designers interactively explore design alternatives and constraints.
NASA Technical Reports Server (NTRS)
Mann, F. I.; Horsewood, J. L.
1974-01-01
A performance-analysis computer program that was developed explicitly to generate optimum electric propulsion trajectory data for missions of interest in the exploration of the solar system is presented. The program was primarily designed to evaluate the performance capabilities of electric propulsion systems and to simulate a wide variety of interplanetary missions. The program numerically integrates the two-body, three-dimensional equations of motion and the Euler-Lagrange equations. Transversality conditions that permit the rapid generation of converged maximum-payload trajectory data are included, as is the optimization of numerous other performance indices for which no transversality conditions exist. The program can also simulate constrained optimum solutions, including trajectories having specified propulsion time and constant thrust cone angle. The program was designed to handle multiple-target missions with various types of encounters, such as rendezvous, stopover, orbital capture, and flyby. Performance requirements for a variety of launch vehicles can be determined.
ERIC Educational Resources Information Center
Sexton, Steven S.; Williamson-Leadley, Sandra
2017-01-01
This article reports on a study of how a 1-year, course-taught, master's level initial teacher education (ITE) program challenged primary student teachers (n = 4) in developing their sense of self-as-teacher. This study examined how the program's incorporation of video capturing technology impacted on these student teachers' development of…
Critical thinking of registered nurses in a fellowship program.
Zori, Susan; Kohn, Nina; Gallo, Kathleen; Friedman, M Isabel
2013-08-01
Critical thinking is essential to nursing practice. This study examined differences in the critical thinking dispositions of registered nurses (RNs) in a nursing fellowship program. Control and experimental groups were used to compare differences in scores on the California Critical Thinking Disposition Inventory (CCTDI) of RNs at three points during a fellowship program: baseline, week 7, and month 5. The control group consisted of RNs who received no education in critical thinking. The experimental group received education in critical thinking using simulated scenarios and reflective journaling. CCTDI scores examined with analysis of variance showed no significant difference within groups over time or between groups. The baseline scores of the experimental group were slightly higher than those of the control group. Chi-square analysis of demographic variables between the two groups showed no significant differences. Critical thinking dispositions are a combination of attitudes, values, and beliefs that make up one's personality based on life experience. Lack of statistical significance using a quantitative approach did not capture the development of the critical thinking dispositions of participants. A secondary qualitative analysis of journal entries is being conducted. Copyright 2013, SLACK Incorporated.
NASA Technical Reports Server (NTRS)
Searcy, Brittani
2017-01-01
Using virtual environments to assess complex, large-scale human tasks provides timely and cost-effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance with human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, the Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine whether a specific task can be safely performed by both a 5th percentile (approximately 5 ft) female and a 95th percentile (approximately 6 ft 1 in) male. In addition, the analysis will help identify any tools or other accommodations that may help complete the task. These assessments are critical for the safety of ground support engineers and for keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame-by-frame basis, while virtual reality gives the actor (the person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors analysis for SLS and the benefits of analyzing tasks in NASA MSFC's VEL.
IMCS reflight certification requirements and design specifications
NASA Technical Reports Server (NTRS)
1984-01-01
The requirements for reflight certification are established. Software requirements encompass the software programs that are resident in the PCC, DEP, PDSS, EC, or any related GSE. A design approach for the reflight software packages is recommended. These designs will be of sufficient detail to permit the implementation of reflight software. The PDSS/IMC Reflight Certification system provides the tools and mechanisms for the user to perform the reflight certification test procedures, test data capture, test data display, and test data analysis. The system as defined will be structured to permit maximum automation of reflight certification procedures and test data analysis.
Statistical inference for capture-recapture experiments
Pollock, Kenneth H.; Nichols, James D.; Brownie, Cavell; Hines, James E.
1990-01-01
This monograph presents a detailed, practical exposition on the design, analysis, and interpretation of capture-recapture studies. The Lincoln-Petersen model (Chapter 2) and the closed population models (Chapter 3) are presented only briefly because these models have been covered in detail elsewhere. The Jolly-Seber open population model, which is central to the monograph, is covered in detail in Chapter 4. In Chapter 5 we consider the "enumeration" or "calendar of captures" approach, which is widely used by mammalogists and other vertebrate ecologists. We strongly recommend that it be abandoned in favor of analyses based on the Jolly-Seber model. We consider 2 restricted versions of the Jolly-Seber model. We believe the first of these, which allows losses (mortality or emigration) but not additions (births or immigration), is likely to be useful in practice. Another series of restrictive models requires the assumptions of a constant survival rate or a constant survival rate and a constant capture rate for the duration of the study. Detailed examples are given that illustrate the usefulness of these restrictions. There often can be a substantial gain in precision over Jolly-Seber estimates. In Chapter 5 we also consider 2 generalizations of the Jolly-Seber model. The temporary trap response model allows newly marked animals to have different survival and capture rates for 1 period. The other generalization is the cohort Jolly-Seber model. Ideally all animals would be marked as young, and age effects considered by using the Jolly-Seber model on each cohort separately. In Chapter 6 we present a detailed description of an age-dependent Jolly-Seber model, which can be used when 2 or more identifiable age classes are marked. In Chapter 7 we present a detailed description of the "robust" design. Under this design each primary period contains several secondary sampling periods. We propose an estimation procedure based on closed and open population models that allows for heterogeneity and trap response of capture rates (hence the name robust design). We begin by considering just 1 age class and then extend to 2 age classes. When there are 2 age classes it is possible to distinguish immigrants and births. In Chapter 8 we give a detailed discussion of the design of capture-recapture studies. First, capture-recapture is compared to other possible sampling procedures. Next, the design of capture-recapture studies to minimize assumption violations is considered. Finally, we consider the precision of parameter estimates and present figures on proportional standard errors for a variety of initial parameter values to aid the biologist about to plan a study. A new program, JOLLY, has been written to accompany the material on the Jolly-Seber model (Chapter 4) and its extensions (Chapter 5). Another new program, JOLLYAGE, has been written for a special case of the age-dependent model (Chapter 6) where there are only 2 age classes. In Chapter 9 a brief description of the different versions of the 2 programs is given. Chapter 10 gives a brief description of some alternative approaches that were not considered in this monograph. We believe that an excellent overall view of capture-recapture models may be obtained by reading the monograph by White et al. (1982) emphasizing closed models and then reading this monograph where we concentrate on open models. The important recent monograph by Burnham et al. (1987) could then be read if there were interest in the comparison of different populations.
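For readers new to this model family, the simplest member is the two-sample Lincoln-Petersen estimator covered in Chapter 2; a small sketch in its bias-corrected Chapman form, with illustrative counts (the sample sizes below are invented), is:

```python
# Lincoln-Petersen abundance estimate, Chapman bias-corrected form,
# with Seber's variance estimate.
def chapman_estimate(n1, n2, m2):
    """n1 marked in sample 1; n2 caught in sample 2; m2 of those were marked."""
    n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
           / ((m2 + 1) ** 2 * (m2 + 2)))
    return n_hat, var ** 0.5

n_hat, se = chapman_estimate(n1=200, n2=150, m2=30)
print(f"N-hat = {n_hat:.0f}, SE = {se:.0f}")  # N-hat = 978, SE = 140
```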
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis.
Cohnstaedt, Lee W; Rochon, Kateryn; Duehl, Adrian J; Anderson, John F; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C; Obenauer, Peter J; Campbell, James F; Lysyk, Tim J; Allan, Sandra A
2012-03-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium "Advancements in arthropod monitoring technology, techniques, and analysis" presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolation of real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles.
FBI Fingerprint Image Capture System High-Speed-Front-End throughput modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rathke, P.M.
1993-09-01
The Federal Bureau of Investigation (FBI) has undertaken a major modernization effort called the Integrated Automated Fingerprint Identification System (IAFIS). This system will provide centralized identification services using automated fingerprint, subject descriptor, mugshot, and document processing. A high-speed Fingerprint Image Capture System (FICS) is under development as part of the IAFIS program. The FICS will capture digital and microfilm images of FBI fingerprint cards for input into a central database. One FICS design supports two front-end scanning subsystems, known as the High-Speed-Front-End (HSFE) and Low-Speed-Front-End, to supply image data to a common data processing subsystem. The production rate of the HSFE is critical to meeting the FBI's fingerprint card processing schedule. A model of the HSFE has been developed to help identify the issues driving the production rate, assist in the development of component specifications, and guide the evolution of an operations plan. A description of the model development is given, the assumptions are presented, and some HSFE throughput analysis is performed.
Nugen, Sam R; Leonard, Barbara; Baeumner, Antje J
2007-05-15
We developed a software program for the rapid selection of detection probes to be used in nucleic acid-based assays. In comparison to commercially available software packages, our program allows the addition of oligotags as required by nucleic acid sequence-based amplification (NASBA) as well as automatic BLAST searches for all probe/primer pairs. We then demonstrated the usefulness of the program by designing a novel lateral flow biosensor for Streptococcus pyogenes that does not rely on amplification methods such as the polymerase chain reaction (PCR) or NASBA to obtain low limits of detection, but instead uses multiple reporter and capture probes per target sequence and an instantaneous amplification via dye-encapsulating liposomes. These assays will decrease the detection time to just a 20 min hybridization reaction and avoid costly enzymatic gene amplification reactions. The lateral flow assay was developed quantifying the 16S rRNA from S. pyogenes by designing reporter and capture probes that specifically hybridize with the RNA and form a sandwich. DNA reporter probes were tagged with dye-encapsulating liposomes, biotinylated DNA oligonucleotides were used as capture probes. From the initial number of capture and reporter probes chosen, a combination of two capture and three reporter probes were found to provide optimal signal generation and significant enhancement over single capture/reporter probe combinations. The selectivity of the biosensor was proven by analyzing organisms closely related to S. pyogenes, such as other Streptococcus and Enterococcus species. All probes had been selected by the software program within minutes and no iterative optimization and re-design of the oligonucleotides was required which enabled a very rapid biosensor prototyping. While the sensitivity obtained with the biosensor was only 135 ng, future experiments will decrease this significantly by the addition of more reporter and capture probes for either the same rRNA or a different nucleic acid target molecule. This will lead to the possibility of detecting S. pyogenes with a rugged assay that does not require a cell culturing or gene amplification step and will therefore enable rapid, specific and sensitive onsite testing.
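A hypothetical prescreen in the spirit of the probe-selection step described above (the window length, GC bounds, and homopolymer filter below are invented for illustration and are not the authors' algorithm): candidate probes are windows of the target sequence that pass simple composition checks, with BLAST screening of the survivors left as a separate downstream step.

```python
# Sliding-window probe prescreen: keep candidates passing GC-content and
# homopolymer-run filters (thresholds are illustrative assumptions).
def candidate_probes(target, length=24, gc_range=(0.40, 0.60), max_run=4):
    probes = []
    for i in range(len(target) - length + 1):
        p = target[i:i + length]
        gc = (p.count("G") + p.count("C")) / length
        has_run = any(base * (max_run + 1) in p for base in "ACGT")
        if gc_range[0] <= gc <= gc_range[1] and not has_run:
            probes.append((i, p, round(gc, 2)))
    return probes

# Hypothetical target fragment, not an actual S. pyogenes 16S rRNA sequence.
target = "ACGTGCTAGCTAGGCTTACGATCGGATCCGTAGCTAGCATCGGATCGTACGTTAGC"
for pos, seq, gc in candidate_probes(target)[:3]:
    print(pos, seq, gc)
```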
Effects of sampling conditions on DNA-based estimates of American black bear abundance
Laufenberg, Jared S.; Van Manen, Frank T.; Clark, Joseph D.
2013-01-01
DNA-based capture-mark-recapture techniques are commonly used to estimate American black bear (Ursus americanus) population abundance (N). Although the technique is well established, many questions remain regarding study design. In particular, relationships among N, capture probability of heterogeneity mixtures A and B (pA and pB, respectively, or p, collectively), the proportion of each mixture (π), number of capture occasions (k), and probability of obtaining reliable estimates of N are not fully understood. We investigated these relationships using 1) an empirical dataset of DNA samples for which true N was unknown and 2) simulated datasets with known properties that represented a broader array of sampling conditions. For the empirical data analysis, we used the full closed population with heterogeneity data type in Program MARK to estimate N for a black bear population in Great Smoky Mountains National Park, Tennessee. We systematically reduced the number of those samples used in the analysis to evaluate the effect that changes in capture probabilities may have on parameter estimates. Model-averaged N for females and males were 161 (95% CI = 114–272) and 100 (95% CI = 74–167), respectively (pooled N = 261, 95% CI = 192–419), and the average weekly p was 0.09 for females and 0.12 for males. When we reduced the number of samples of the empirical data, support for heterogeneity models decreased. For the simulation analysis, we generated capture data with individual heterogeneity covering a range of sampling conditions commonly encountered in DNA-based capture-mark-recapture studies and examined the relationships between those conditions and accuracy (i.e., probability of obtaining an estimated N that is within 20% of true N), coverage (i.e., probability that the 95% confidence interval includes true N), and precision (i.e., probability of obtaining a coefficient of variation ≤20%) of estimates using logistic regression. The capture probability for the larger of 2 mixture proportions of the population (i.e., pA or pB, depending on the value of π) was most important for predicting accuracy and precision, whereas capture probabilities of both mixture proportions (pA and pB) were important to explain variation in coverage. Based on sampling conditions similar to parameter estimates from the empirical dataset (pA = 0.30, pB = 0.05, N = 250, π = 0.15, and k = 10), predicted accuracy and precision were low (60% and 53%, respectively), whereas coverage was high (94%). Increasing pB, the capture probability for the predominant but most difficult to capture proportion of the population, was most effective to improve accuracy under those conditions. However, manipulation of other parameters may be more effective under different conditions. In general, the probabilities of obtaining accurate and precise estimates were best when p ≥ 0.2. Our regression models can be used by managers to evaluate specific sampling scenarios and guide development of sampling frameworks or to assess reliability of DNA-based capture-mark-recapture studies.
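The accuracy criterion used above (an estimated N within 20% of true N) is straightforward to reproduce in a toy simulation. The sketch below is a minimal stand-in, assuming Python with NumPy and substituting the simple Chao (1987) Mh lower-bound estimator for the full closed-captures-with-heterogeneity models the authors fit in Program MARK; the parameter values echo the sampling conditions quoted in the abstract, and everything else is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_counts(N=250, pA=0.30, pB=0.05, pi=0.15, k=10):
    """Two-mixture (Mh) closed population: each bear belongs to mixture A
    with probability pi and is detected on each of k occasions with its
    mixture-specific capture probability."""
    p = np.where(rng.random(N) < pi, pA, pB)   # per-individual capture probability
    captures = rng.binomial(k, p)              # times each bear is caught
    return captures[captures > 0]              # only ever-captured bears are observed

def chao_mh(counts):
    """Chao (1987) lower-bound abundance estimator under heterogeneity:
    N_hat = S + f1^2 / (2*f2), where f1/f2 = number caught exactly once/twice."""
    S, f1, f2 = len(counts), np.sum(counts == 1), np.sum(counts == 2)
    return S + f1**2 / (2 * max(f2, 1))        # guard against f2 == 0

true_N, reps, hits = 250, 2000, 0
for _ in range(reps):
    n_hat = chao_mh(simulate_counts(N=true_N))
    hits += abs(n_hat - true_N) / true_N <= 0.20   # the paper's accuracy criterion
print(f"P(estimate within 20% of true N) ~ {hits / reps:.2f}")
```

Raising pB in the simulation should visibly raise the hit rate, mirroring the authors' conclusion that the hard-to-capture mixture dominates accuracy.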
Model authoring system for fail safe analysis
NASA Technical Reports Server (NTRS)
Sikora, Scott E.
1990-01-01
The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.
Neutron capture therapy: Years of experimentation---Years of reflection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farr, L.E.
1991-12-16
This report describes early research on neutron capture therapy over a number of years, beginning in 1950, speaking briefly of patient treatments but dwelling mostly on interpretations of our animal experiments. This work was carried out over eighteen years, beginning over forty years ago. Yet, it is only fitting to start by relating how neutron capture therapy became part of Brookhaven's Medical Research Center program.
Rouco, Carlos; Moreno, Sacramento; Santoro, Simone
2016-10-01
Vaccination campaigns against myxomatosis and rabbit haemorrhagic disease (RHD) are commonly used in translocation programs conducted for the purpose of recovering wild European rabbit populations in Iberian Mediterranean ecosystems. In most cases rabbits are vaccinated 'blind' (i.e. without assessing their prior immunological status) for economic and logistic reasons. However, there is conflicting evidence on the effectiveness of such an approach. We tested whether blind vaccination against myxomatosis and rabbit haemorrhagic disease improved rabbit survival in a rabbit translocation program where wild rabbits were kept in semi-natural conditions in three enclosures. We conducted nine capture sessions over two years (2008-2010) and used the information collected to compare the survival of vaccinated (n=511) versus unvaccinated (n=161) adult wild rabbits using capture-mark-recapture analysis. Average monthly survival was no different for vaccinated versus unvaccinated individuals, both in the period between release and first capture (short term) and from the first capture onward (long term). Rabbit survival was lower in the short term than in the long term regardless of whether rabbits were vaccinated. Lower short-term survival could be due to the stress induced by the translocation process itself (e.g. handling stress). The absence of any overall effect of vaccination on survival could be explained by two non-exclusive reasons. First, the vaccine may have interfered with natural antibodies already present in the donor population: donor populations have a high density of rabbits and, most likely, a high prevalence of antibodies resulting from previous natural exposure to these diseases. Second, no severe outbreaks occurred during the study period. Based on our findings we argue that blind vaccination of adult rabbits in translocation programs may often be ineffective and unnecessarily costly. In particular, since outbreaks are hard to predict and vaccination of rabbits with natural antibodies is ineffective, it is crucial to assess the immunological status of the donor population before translocating adult rabbits. Copyright © 2016 Elsevier B.V. All rights reserved.
Sim, John J; Batech, Michael; Danforth, Kim N; Rutkowski, Mark P; Jacobsen, Steven J; Kanter, Michael H
2017-01-01
Objectives: The Kaiser Permanente Southern California (KPSC) creatinine safety program (Creatinine SureNet) identifies and outreaches to thousands of people annually who may have had a missed diagnosis for chronic kidney disease (CKD). We sought to determine the value of this outpatient program and evaluate opportunities for improvement. Methods: Longitudinal cohort study (February 2010 through December 2015) of KPSC members captured into the creatinine safety program who were characterized using demographics, laboratory results, and different estimations of glomerular filtration rate. Age- and sex-adjusted rates of end-stage renal disease (ESRD) were compared with those in the overall KPSC population. Results: Among 12,394 individuals, 83 (0.7%) reached ESRD. The age- and sex-adjusted relative risk of ESRD was 2.7 times higher compared with the KPSC general population during the same period (94.7 vs 35.4 per 100,000 person-years; p < 0.001). Screening with the Chronic Kidney Disease Epidemiology Collaboration (vs Modification of Diet in Renal Disease) equation would capture 44% fewer individuals and have a higher predictive value for CKD. Of those who had repeated creatinine measurements, only 13% had a urine study performed (32% among patients with confirmed CKD). Conclusion: Our study found a higher incidence of ESRD among individuals captured into the KPSC creatinine safety program. If the Chronic Kidney Disease Epidemiology Collaboration equation were used, fewer people would have been captured while improving the accuracy for diagnosing CKD. Urine testing was low even among patients with confirmed CKD. Our findings demonstrate the importance of a creatinine safety net program in an integrated health system but also suggest opportunities to improve CKD care and screening. PMID:28241912
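Both eGFR equations the study compares are published formulas, so the screening difference is easy to make concrete. A minimal sketch (Python; the example patient is invented, and the 60 mL/min/1.73 m2 threshold is the conventional CKD screening cutoff rather than a value taken from the study):

```python
def egfr_mdrd(scr_mg_dl, age, female, black):
    """4-variable MDRD study equation (IDMS-traceable), mL/min/1.73 m^2."""
    e = 175 * scr_mg_dl**-1.154 * age**-0.203
    return e * (0.742 if female else 1.0) * (1.212 if black else 1.0)

def egfr_ckd_epi(scr_mg_dl, age, female, black):
    """2009 CKD-EPI creatinine equation, mL/min/1.73 m^2."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    e = (141 * min(scr_mg_dl / kappa, 1.0)**alpha
             * max(scr_mg_dl / kappa, 1.0)**-1.209
             * 0.993**age)
    return e * (1.018 if female else 1.0) * (1.159 if black else 1.0)

# Invented borderline patient: 55-year-old non-Black woman, creatinine 1.0 mg/dL.
print(egfr_mdrd(1.0, 55, female=True, black=False))     # ~58 -> flagged (<60)
print(egfr_ckd_epi(1.0, 55, female=True, black=False))  # ~63 -> not flagged
```

Borderline cases like this one are why switching equations would capture fewer people into the safety-net program.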
The financial impact of a clinical academic practice partnership.
Greene, Mary Ann; Turner, James
2014-01-01
New strategies to provide clinical experiences for nursing students have caused nursing schools and hospitals to evaluate program costs. A Microsoft Excel model, which captures costs and associated benefits, was developed and is described here. The financial analysis shows that the Clinical Academic Practice Program framework for nursing clinical education, often preferred by students, can offer financial advantages to participating hospitals and schools of nursing. The model is potentially a tool for schools of nursing to enlist hospitals and to help manage expenses of clinical education. Hospitals may also use the Hospital Nursing Unit Staffing and Expense Worksheet in planning staffing when students are assigned to units and the cost/benefit findings to enlist management support.
Chang, Sung-A; Lee, Sang-Chol; Kim, Eun-Young; Hahm, Seung-Hee; Jang, Shin Yi; Park, Sung-Ji; Choi, Jin-Oh; Park, Seung Woo; Choe, Yeon Hyeon; Oh, Jae K
2011-08-01
With recent developments in echocardiographic technology, a new system using real-time three-dimensional echocardiography (RT3DE) that allows single-beat acquisition of the entire volume of the left ventricle and incorporates algorithms for automated border detection has been introduced. Provided that these techniques are acceptably reliable, three-dimensional echocardiography may be much more useful for clinical practice. The aim of this study was to evaluate the feasibility and accuracy of left ventricular (LV) volume measurements by RT3DE using the single-beat full-volume capture technique. One hundred nine consecutive patients scheduled for cardiac magnetic resonance imaging and RT3DE using the single-beat full-volume capture technique on the same day were recruited. LV end-systolic volume, end-diastolic volume, and ejection fraction were measured using an auto-contouring algorithm from data acquired on RT3DE. The data were compared with the same measurements obtained using cardiac magnetic resonance imaging. Volume measurements on RT3DE with single-beat full-volume capture were feasible in 84% of patients. Both interobserver and intraobserver variability of three-dimensional measurements of end-systolic and end-diastolic volumes showed excellent agreement. Pearson's correlation analysis showed a close correlation of end-systolic and end-diastolic volumes between RT3DE and cardiac magnetic resonance imaging (r = 0.94 and r = 0.91, respectively, P < .0001 for both). Bland-Altman analysis showed reasonable limits of agreement. After application of the auto-contouring algorithm, the rate of successful auto-contouring (cases requiring minimal manual corrections) was <50%. RT3DE using single-beat full-volume capture is an easy and reliable technique to assess LV volume and systolic function in clinical practice. However, the image quality and low frame rate still limit its application for dilated left ventricles, and the automated volume analysis program needs more development to make it clinically efficacious. Copyright © 2011 American Society of Echocardiography. Published by Mosby, Inc. All rights reserved.
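The agreement statistics reported here, Pearson correlation plus Bland-Altman limits of agreement, take only a few lines to compute. A minimal sketch (Python/NumPy; the paired end-diastolic volumes are made-up numbers, not study data):

```python
import numpy as np

def bland_altman(a, b):
    """Return mean bias and 95% limits of agreement for paired measurements."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

rt3de = [110, 152, 98, 201, 133, 175]   # hypothetical RT3DE volumes (mL)
cmr   = [115, 158, 104, 210, 130, 182]  # hypothetical cardiac MRI volumes (mL)

r = np.corrcoef(rt3de, cmr)[0, 1]
bias, lo, hi = bland_altman(rt3de, cmr)
print(f"Pearson r = {r:.2f}; bias = {bias:.1f} mL; 95% LoA [{lo:.1f}, {hi:.1f}] mL")
```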
High-fidelity, low-cost, automated method to assess laparoscopic skills objectively.
Gray, Richard J; Kahol, Kanav; Islam, Gazi; Smith, Marshall; Chapital, Alyssa; Ferrara, John
2012-01-01
We sought to define the extent to which a motion analysis-based assessment system constructed with simple equipment could measure technical skill objectively and quantitatively. An "off-the-shelf" digital video system was used to capture the hand and instrument movement of surgical trainees (beginner level = PGY-1, intermediate level = PGY-3, and advanced level = PGY-5/fellows) while they performed a peg transfer exercise. The video data were passed through a custom computer vision algorithm that analyzed incoming pixels to measure movement smoothness objectively. The beginner-level group had the poorest performance, whereas those in the advanced group generated the highest scores. Intermediate-level trainees scored significantly (p < 0.04) better than beginner trainees. Advanced-level trainees scored significantly better than intermediate-level trainees and beginner-level trainees (p < 0.04 and p < 0.03, respectively). A computer vision-based analysis of surgical movements provides an objective basis for technical expertise-level analysis with construct validity. The technology to capture the data is simple, low cost, and readily available, and it obviates the need for expert human assessment in this setting. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
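The abstract does not disclose the exact smoothness measure, so the sketch below uses one common choice from the motion-analysis literature, the log dimensionless jerk, applied to a tracked 2-D instrument-tip path; the 30 fps frame rate and the synthetic "tremor" trajectory are assumptions for illustration only.

```python
import numpy as np

def log_dimensionless_jerk(pos, dt):
    """Log dimensionless jerk (LDJ) smoothness metric; more negative means
    less smooth. pos: (T, 2) array of tracked tip coordinates; dt: seconds
    per frame."""
    vel = np.gradient(pos, dt, axis=0)
    acc = np.gradient(vel, dt, axis=0)
    jerk = np.gradient(acc, dt, axis=0)
    duration = dt * (len(pos) - 1)
    v_peak = np.linalg.norm(vel, axis=1).max()
    integral = ((jerk**2).sum(axis=1) * dt).sum()   # Riemann sum of |jerk|^2
    return -np.log((duration**3 / v_peak**2) * integral)

t = np.arange(0, 5, 1/30)                           # 5 s of 30 fps video
smooth = np.c_[np.sin(t), np.cos(t)]                # smooth circular motion
shaky = smooth + 0.01 * np.sin(40 * t)[:, None]     # same path plus tremor
print(log_dimensionless_jerk(smooth, 1/30))         # higher (smoother)
print(log_dimensionless_jerk(shaky, 1/30))          # lower (less smooth)
```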
Real-Time Data Capture and Management Evaluation and Performance Measures : Evaluation Framework
DOT National Transportation Integrated Search
2011-09-01
Through connected vehicle research, the U.S. DOT Intelligent Transportation Systems Joint Program Office (ITS JPO) is leading an effort to assess the potential for systematic and dynamic data capture from vehicles, travelers and the transportation sy...
DOT National Transportation Integrated Search
2012-11-01
The Connected Vehicle Mobility Standards Coordination Plan project links activities in three programs (Data Capture and Management, Dynamic Mobility Applications, and ITS Standards). The plan coordinates the timing, intent and relationship of activit...
Qualitative and quantitative reasoning about thermodynamics
NASA Technical Reports Server (NTRS)
Skorstad, Gordon; Forbus, Ken
1989-01-01
One goal of qualitative physics is to capture the tacit knowledge of engineers and scientists. It is shown how Qualitative Process theory can be used to express concepts of engineering thermodynamics. In particular, it is shown how to integrate qualitative and quantitative knowledge to solve textbook problems involving thermodynamic cycles, such as gas turbine plants and steam power plants. These ideas were implemented in a program called SCHISM. Its analysis of a sample textbook problem is described and plans for future work are discussed.
It was huge! Nursing students' first experience at AORN Congress.
Byrne, Michelle; Cantrell, Kelly; Fletcher, Daphne; McRaney, David; Morris, Kelly
2004-01-01
AN EXPERIENTIAL KNOWLEDGE of mentoring through nursing students' perspectives may enhance AORN's ability to recruit students to perioperative nursing and aid future planning for student involvement in the Association. IN 2003, four first-year nursing students attended the AORN Congress in Chicago with their nursing instructor and mentor. The students' experiences were captured using a thematic analysis to analyze their journals. THE FIVE COMMON THEMES identified were "it was huge," "exhibits," "student program," "exploring the city," and "suggestions for future planning."
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis
Rochon, Kateryn; Duehl, Adrian J.; Anderson, John F.; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C.; Obenauer, Peter J.; Campbell, James F.; Lysyk, Tim J.; Allan, Sandra A.
2015-01-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthropod monitoring technology, techniques, and analysis” presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles. PMID:26543242
Design knowledge capture for the space station
NASA Technical Reports Server (NTRS)
Crouse, K. R.; Wechsler, D. B.
1987-01-01
The benefits of design knowledge availability are identifiable and pervasive. The implementation of design knowledge capture and storage using current technology increases the probability for success, while providing for a degree of access compatibility with future applications. The space station design definition should be expanded to include design knowledge. Design knowledge should be captured. A critical timing relationship exists between the space station development program and the implementation of this project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alptekin, Gokhan; Jayaraman, Ambalavanan; Dietz, Steven
In this project TDA Research, Inc. (TDA) has developed a new post-combustion carbon capture technology based on a vacuum swing adsorption system that uses a steam purge, and demonstrated its technical feasibility and economic viability in laboratory-scale tests and tests in actual coal-derived flue gas. TDA uses an advanced physical adsorbent to selectively remove CO2 from the flue gas. The sorbent exhibits a much higher affinity for CO2 than N2, H2O or O2, enabling effective CO2 separation from the flue gas. We also carried out a detailed process design and analysis of the new system as part of both sub-critical and super-critical pulverized coal fired power plants. The new technology uses a low cost, high capacity adsorbent that selectively removes CO2 in the presence of moisture at the flue gas temperature without a need for significant cooling of the flue gas or moisture removal. The sorbent is based on a TDA proprietary mesoporous carbon with surface-functionalized groups that remove CO2 via physical adsorption. The high surface area and favorable porosity of the sorbent also provide a unique platform to introduce additional functionality, such as active groups to remove trace metals (e.g., Hg, As). In collaboration with the Advanced Power and Energy Program of the University of California, Irvine (UCI), TDA developed system simulation models using Aspen Plus™ simulation software to assess the economic viability of TDA's VSA-based post-combustion carbon capture technology. The levelized cost of electricity including the TS&M costs for CO2 is calculated as $116.71/MWh and $113.76/MWh for the TDA system integrated with sub-critical and super-critical pulverized coal fired power plants, much lower than the $153.03/MWh and $147.44/MWh calculated for the corresponding amine-based systems. The cost of CO2 captured for TDA's VSA-based system is $38.90 and $39.71 per tonne compared to $65.46 and $66.56 per tonne for the amine-based system on a 2011 dollar basis, a 40% lower cost of CO2 captured. In this analysis we have used a sorbent life of 4 years. If a longer sorbent life can be maintained (which is not unreasonable for fixed-bed commercial PSA systems), this would lower the cost of CO2 captured by $0.05 per tonne (e.g., to $38.85 and $39.66 per tonne at a 5-year sorbent replacement interval). These system analysis results suggest that TDA's VSA-based post-combustion capture technology can substantially improve the power plant's thermal performance while achieving near-zero emissions, including greater than 90% carbon capture. The higher net plant efficiency and lower capital and operating costs result in a substantial reduction in the cost of carbon capture and the cost of electricity for a power plant equipped with TDA's technology.
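The headline $/tonne figures follow from a first-order relation between the capture plant's LCOE premium and how much CO2 it captures per MWh. A worked sketch with stand-in inputs (assumed for illustration, not the report's internal values):

```python
def cost_per_tonne_captured(lcoe_capture, lcoe_ref, t_co2_per_mwh):
    """($/MWh premium of the capture plant) / (tonnes CO2 captured per MWh)."""
    return (lcoe_capture - lcoe_ref) / t_co2_per_mwh

# Assumed reference plant at $81/MWh, capture plant at $117/MWh,
# capturing roughly 0.9 t CO2 per MWh of net generation.
print(cost_per_tonne_captured(117.0, 81.0, 0.9))  # -> 40.0 $/tonne,
# the same order as the reported $38.90-$39.71 per tonne
```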
Proulx, Gilbert; Rodtka, Dwight
2015-01-01
Although predation bounty programs (rewards offered for capturing or killing an animal) ended more than 40 years ago in Canada, they were reintroduced in Alberta in 2007 by hunting, trapping, and farming organizations, municipalities and counties, and in 2009 in Saskatchewan, by municipal and provincial governments and the Saskatchewan Cattlemen’s Association. Bounty hunters use inhumane and non-selective killing methods such as shooting animals in non-vital regions, killing neck snares, and strychnine poisoning, which cause suffering and delayed deaths. These methods are unselective and kill many non-target species, some of them at risk. Predator bounty programs have been found to be ineffective by wildlife professionals, and they use killing methods that cause needless suffering and jeopardize wildlife conservation programs. Our analysis therefore indicates that government agencies should not permit the implementation of bounty programs. Accordingly, they must develop conservation programs that will minimize wildlife-human conflicts, prevent the unnecessary and inhumane killing of animals, and ensure the persistence of all wildlife species. PMID:26479482
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brian McPherson
The Southwest Partnership on Carbon Sequestration completed its Phase I program in December 2005. The main objective of the Southwest Partnership Phase I project was to evaluate and demonstrate the means for achieving an 18% reduction in carbon intensity by 2012. Many other goals were accomplished on the way to this objective, including (1) analysis of CO2 storage options in the region, including characterization of storage capacities and transportation options, (2) analysis and summary of CO2 sources, (3) analysis and summary of CO2 separation and capture technologies employed in the region, (4) evaluation and ranking of the most appropriate sequestration technologies for capture and storage of CO2 in the Southwest Region, (5) dissemination of existing regulatory/permitting requirements, and (6) assessing and initiating public knowledge and acceptance of possible sequestration approaches. Results of the Southwest Partnership's Phase I evaluation suggested that the most convenient and practical 'first opportunities' for sequestration would lie along existing CO2 pipelines in the region. Action plans for six Phase II validation tests in the region were developed, with a portfolio that includes four geologic pilot tests distributed among Utah, New Mexico, and Texas. The Partnership will also conduct a regional terrestrial sequestration pilot program focusing on improved terrestrial MMV methods and reporting approaches specific for the Southwest region. The sixth and final validation test consists of a local-scale terrestrial pilot involving restoration of riparian lands for sequestration purposes. The validation test will use desalinated waters produced from one of the geologic pilot tests. The Southwest Regional Partnership comprises a large, diverse group of expert organizations and individuals specializing in carbon sequestration science and engineering, as well as public policy and outreach. These partners include 21 state government agencies and universities, five major electric utility companies, seven oil, gas and coal companies, three federal agencies, the Navajo Nation, several NGOs, and the Western Governors Association. This group is continuing its work in the Phase II Validation Program, slated to conclude in 2009.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, J. G.; Morton, R. L.; Castillo, C.
2011-02-01
A multi-level (facility and programmatic) risk assessment was conducted for the facilities in the Nevada National Security Site (NNSS) Readiness in Technical Base and Facilities (RTBF) Program, and the results were included in a new Risk Management Plan (RMP), which was incorporated into the fiscal year (FY) 2010 Integrated Plans. Risks, risk events, probability, consequence(s), and mitigation strategies were identified and captured for most scope areas (i.e., risk categories) during the facilitated risk workshops. Risk mitigations (i.e., efforts in addition to existing controls) were identified during the facilitated risk workshops when the risk event was identified. Risk mitigation strategies fell into two broad categories: threats or opportunities. Improvement projects were identified and linked to specific risks they mitigate, making the connection of risk reduction through investments for the annual Site Execution Plan. Due to the amount of data that was collected, analysis to be performed, and reports to be generated, a Risk Assessment/Management Tool (RAMtool) database was developed to analyze the risks in real-time, at multiple levels, which reinforced the site-level risk management process and procedures. The RAMtool database was developed and designed to assist in the capturing and analysis of the key elements of risk: probability, consequence, and impact. The RAMtool calculates the facility-level and programmatic-level risk factors to enable a side-by-side comparison to see where the facility manager and program manager should focus their risk reduction efforts and funding. This enables them to make solid decisions on priorities and funding to maximize the risk reduction. A more active risk management process was developed in which risks and opportunities are actively managed, monitored, and controlled by each facility more aggressively and frequently. Risk owners have the responsibility and accountability to manage their assigned risk in real-time, using the RAMtool database.
Boeing CST-100 Starliner Seat Test
2017-02-21
Engineers working with Boeing's CST-100 Starliner test the spacecraft's seat design in Mesa, Arizona, focusing on how the spacecraft seats would protect an astronaut's head, neck and spine during the 240-mile descent from the International Space Station. The company incorporated test dummies for a detailed analysis of impacts on a crew returning to Earth. The human-sized dummies were equipped with sensitive instrumentation and secured in the seats for 30 drop tests at varying heights, angles, velocities and seat orientations in order to mimic actual landing conditions. High-speed cameras captured the footage for further analysis. The Starliner spacecraft is being developed in partnership with NASA's Commercial Crew Program.
A model for long-distance dispersal of boll weevils (Coleoptera: Curculionidae)
NASA Astrophysics Data System (ADS)
Westbrook, John K.; Eyster, Ritchie S.; Allen, Charles T.
2011-07-01
The boll weevil, Anthonomus grandis (Boheman), has been a major insect pest of cotton production in the US, accounting for yield losses and control costs on the order of several billion US dollars since the introduction of the pest in 1892. Boll weevil eradication programs have eliminated reproducing populations in nearly 94%, and progressed toward eradication within the remaining 6%, of cotton production areas. However, the ability of weevils to disperse and reinfest eradicated zones threatens to undermine the previous investment toward eradication of this pest. In this study, the HYSPLIT atmospheric dispersion model was used to simulate daily wind-aided dispersal of weevils from the Lower Rio Grande Valley (LRGV) of southern Texas and northeastern Mexico. Simulated weevil dispersal was compared with weekly capture of weevils in pheromone traps along highway trap lines between the LRGV and the South Texas / Winter Garden zone of the Texas Boll Weevil Eradication Program. A logistic regression model was fit to the probability of capturing at least one weevil in individual pheromone traps relative to specific values of simulated weevil dispersal, which resulted in 60.4% concordance, 21.3% discordance, and 18.3% ties in estimating captures and non-captures. During the first full year of active eradication with widespread insecticide applications in 2006, the dispersal model accurately estimated 71.8%, erroneously estimated 12.5%, and tied 15.7% of capture and non-capture events. Model simulations provide a temporal risk assessment over large areas of weevil reinfestation resulting from dispersal by prevailing winds. Eradication program managers can use the model risk assessment information to effectively schedule and target enhanced trapping, crop scouting, and insecticide applications.
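The concordant/discordant/tied percentages quoted here are the standard pairwise diagnostics for a fitted logistic regression (as reported by, e.g., SAS PROC LOGISTIC). A sketch on synthetic trap data (Python with scikit-learn; the dispersal index, coefficients, and sample size are invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in: one simulated-dispersal value per trap-week, and whether
# the trap caught at least one weevil, generated from a known logistic curve.
dispersal = rng.gamma(2.0, 1.0, 500)
p_true = 1 / (1 + np.exp(-(0.8 * dispersal - 2.0)))
caught = rng.random(500) < p_true

model = LogisticRegression().fit(dispersal.reshape(-1, 1), caught)
score = model.predict_proba(dispersal.reshape(-1, 1))[:, 1]

# Compare fitted probabilities over every (capture, non-capture) pair.
pos, neg = score[caught], score[~caught]
n_pairs = len(pos) * len(neg)
conc = (pos[:, None] > neg[None, :]).sum() / n_pairs
disc = (pos[:, None] < neg[None, :]).sum() / n_pairs
print(f"concordant {conc:.1%}, discordant {disc:.1%}, tied {1 - conc - disc:.1%}")
```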
Expressions Module for the Satellite Orbit Analysis Program
NASA Technical Reports Server (NTRS)
Edmonds, Karina
2008-01-01
The Expressions Module is a software module that has been incorporated into the Satellite Orbit Analysis Program (SOAP). The module includes an expressions-parser submodule built on top of an analytical system, enabling the user to define logical and numerical variables and constants. The variables can capture output from SOAP orbital-prediction and geometric-engine computations. The module can combine variables and constants with built-in logical operators (such as Boolean AND, OR, and NOT), relational operators (such as >, <, or =), and mathematical operators (such as addition, subtraction, multiplication, division, modulus, exponentiation, differentiation, and integration). Parentheses can be used to specify precedence of operations. The module contains a library of mathematical functions and operations, including logarithms, trigonometric functions, Bessel functions, minimum/maximum operations, and floating-point-to-integer conversions. The module supports combinations of time, distance, and angular units and has a dimensional-analysis component that checks for correct usage of units. A parser based on the Flex language and the Bison program looks for and indicates errors in syntax. SOAP expressions can be built using other expressions as arguments, thus enabling the user to build analytical trees. A graphical user interface facilitates use.
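As a rough illustration of what such an expressions module does (a sketch, not a re-creation of the SOAP flight code), the snippet below evaluates user-defined logical and numerical expressions over named variables by walking a whitelisted subset of Python's ast; the variable names, function set, and example expression are all assumptions.

```python
import ast, math, operator as op

OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul, ast.Div: op.truediv,
       ast.Mod: op.mod, ast.Pow: op.pow, ast.Gt: op.gt, ast.Lt: op.lt,
       ast.Eq: op.eq, ast.And: all, ast.Or: any, ast.USub: op.neg}
FUNCS = {"log": math.log, "sin": math.sin, "min": min, "max": max}

def evaluate(expr, variables):
    """Safely evaluate an expression string against a dict of variables."""
    def ev(node):
        if isinstance(node, ast.Expression): return ev(node.body)
        if isinstance(node, ast.Constant):   return node.value
        if isinstance(node, ast.Name):       return variables[node.id]
        if isinstance(node, ast.BinOp):      return OPS[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.UnaryOp):    return OPS[type(node.op)](ev(node.operand))
        if isinstance(node, ast.Compare):    return OPS[type(node.ops[0])](ev(node.left), ev(node.comparators[0]))
        if isinstance(node, ast.BoolOp):     return OPS[type(node.op)](ev(v) for v in node.values)
        if isinstance(node, ast.Call):       return FUNCS[node.func.id](*[ev(a) for a in node.args])
        raise ValueError(f"disallowed syntax: {ast.dump(node)}")
    return ev(ast.parse(expr, mode="eval"))

# e.g. flag when a (hypothetical) satellite is in view and above 10 deg elevation
print(evaluate("in_view and elevation_deg > 10.0",
               {"in_view": True, "elevation_deg": 23.4}))  # -> True
```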
Systems Analysis of Physical Absorption of CO2 in Ionic Liquids for Pre-Combustion Carbon Capture.
Zhai, Haibo; Rubin, Edward S
2018-04-17
This study develops an integrated technical and economic modeling framework to investigate the feasibility of ionic liquids (ILs) for precombustion carbon capture. The IL 1-hexyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide is modeled as a potential physical solvent for CO2 capture at integrated gasification combined cycle (IGCC) power plants. The analysis reveals that the energy penalty of the IL-based capture system comes mainly from compression of the process and product streams and from solvent pumping, while the major capital cost components are the compressors and absorbers. On the basis of the plant-level analysis, the cost of CO2 avoided by the IL-based capture and storage system is estimated to be $63 per tonne of CO2. Technical and economic comparisons between IL- and Selexol-based capture systems at the plant level show that an IL-based system could be a feasible option for CO2 capture. Improving the CO2 solubility of ILs can simplify the capture process configuration and lower the process energy and cost penalties to further enhance the viability of this technology.
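The $63/tonne metric above, cost of CO2 avoided, is conventionally the LCOE premium divided by the reduction in CO2 emitted per net MWh; it differs from cost per tonne captured because the capture plant's parasitic load shrinks its net output. A sketch with assumed inputs (not numbers from the paper):

```python
def cost_of_co2_avoided(lcoe_cc, lcoe_ref, emitted_ref, emitted_cc):
    """$/tonne avoided = LCOE premium / reduction in emitted t CO2 per MWh."""
    return (lcoe_cc - lcoe_ref) / (emitted_ref - emitted_cc)

# Illustrative IGCC-like numbers (assumed): reference plant $76/MWh emitting
# 0.80 t/MWh; capture plant $120/MWh emitting 0.10 t/MWh.
print(cost_of_co2_avoided(120.0, 76.0, 0.80, 0.10))  # -> ~62.9 $/tonne
```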
NASA Technical Reports Server (NTRS)
Alexander, W. M.; Tanner, William G.; Mcdonald, R. A.; Schaub, G. E.; Stephenson, Stepheni L.; Mcdonnell, J. A. M.; Maag, Carl R.
1994-01-01
The return of a pristine sample from a comet would lead to greater understanding of cometary structures, as well as offering insights into exobiology. The paper presented at the Discovery Program Workshop outlined a set of measurements for what was identified as a SOCCER-like interplanetary mission. Several experiments comprised the total instrumentation. This paper presents a summary of CCSR with an overview of three of the four major instruments, including details of the major dust dynamics experiment and its trajectory measurements. The instrument proposed here offers the opportunity for the return of cometary dust particles gathered in situ. The capture process has been employed aboard the space shuttle with successful results in returning samples to Earth for laboratory analysis. In addition, the sensors will measure the charge, mass, velocity, and size of cometary dust grains during the encounter. These data will help our understanding of dusty plasmas.
NASA Technical Reports Server (NTRS)
Hohwiesner, Bill; Claudinon, Bernard
1991-01-01
The European Space Agency (ESA) has been working to develop an autonomous rendezvous and docking capability since 1984 to enable Hermes to automatically dock with Columbus. As a result, ESA with Matra, MBB, and other space companies have developed technologies that are also directly supportive of the current NASA initiative for Automated Rendezvous and Capture. Fairchild and Matra would like to discuss the results of the applicable ESA/Matra rendezvous and capture developments, and suggest how these capabilities could be used, together with an existing NASA Explorer Platform satellite, to minimize new development and accomplish a cost effective automatic closure and capture demonstration program. Several RV sensors have been developed at breadboard level for the Hermes/Columbus program by Matra, MBB, and SAAB. Detailed algorithms for automatic rendezvous, closure, and capture have been developed by ESA and CNES for application with Hermes to Columbus rendezvous and docking, and they currently are being verified with closed-loop software simulation. The algorithms have multiple closed-loop control modes and phases starting at long range using GPS navigation. Differential navigation is used for coast/continuous thrust homing, holdpoint acquisition, V-bar hopping, and station point acquisition. The proximity operation sensor is used for final closure and capture. A subset of these algorithms, comprising the proximity operations algorithms, could easily be extracted and tailored to a limited objective closure and capture flight demonstration.
An Effective Model of the Retinoic Acid Induced HL-60 Differentiation Program.
Tasseff, Ryan; Jensen, Holly A; Congleton, Johanna; Dai, David; Rogers, Katharine V; Sagar, Adithya; Bunaciu, Rodica P; Yen, Andrew; Varner, Jeffrey D
2017-10-30
In this study, we present an effective model of All-Trans Retinoic Acid (ATRA)-induced differentiation of HL-60 cells. The model describes reinforcing feedback between an ATRA-inducible signalsome complex involving many proteins, including Vav1, a guanine nucleotide exchange factor, and the activation of the mitogen-activated protein kinase (MAPK) cascade. We decomposed the effective model into three modules: a signal initiation module that sensed and transformed an ATRA signal into program activation signals; a signal integration module that controlled the expression of upstream transcription factors; and a phenotype module which encoded the expression of functional differentiation markers from the ATRA-inducible transcription factors. We identified an ensemble of effective model parameters using measurements taken from ATRA-induced HL-60 cells. Using these parameters, model analysis predicted that MAPK activation was bistable as a function of ATRA exposure. Confirmatory experiments supported ATRA-induced bistability. Additionally, the model captured intermediate and phenotypic gene expression data. Knockout analysis suggested Gfi-1 and PPARγ were critical to the ATRA-induced differentiation program. These findings, combined with other literature evidence, suggested that reinforcing feedback is central to hyperactive signaling in a diversity of cell fate programs.
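Bistability of the kind the model predicts for MAPK activation can be demonstrated with a far smaller system than the paper's signalsome model. The toy below (Python; one state variable, all rate constants invented) integrates a single positive-feedback ODE and shows that an intermediate "ATRA" input supports two stable steady states depending on where the system starts:

```python
def simulate(atra, u0, t_end=200.0, dt=0.01):
    """Forward-Euler integration of du/dt = k*atra + vmax*u^n/(K^n + u^n) - d*u,
    where u stands in for activated MAPK and the Hill term is the positive
    feedback. Parameters are illustrative only."""
    k, vmax, K, n, d = 0.2, 1.0, 0.5, 4, 1.0
    u = u0
    for _ in range(int(t_end / dt)):
        u += dt * (k * atra + vmax * u**n / (K**n + u**n) - d * u)
    return u

for atra in (0.0, 0.5, 1.0):
    off = simulate(atra, u0=0.0)   # start from the "off" state
    on  = simulate(atra, u0=1.5)   # start from the "on" state
    print(f"ATRA={atra}: off-start -> {off:.2f}, on-start -> {on:.2f}")
# At ATRA=0.5 the two starting states settle at different values (bistability);
# at ATRA=1.0 both converge to the high state.
```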
NursesforTomorrow: a proactive approach to nursing resource analysis.
Bournes, Debra A; Plummer, Carolyn; Miller, Robert; Ferguson-Paré, Mary
2010-03-01
This paper describes the background, development, implementation and utilization of NursesforTomorrow (N4T), a practical and comprehensive nursing human resources analysis method to capture regional, institutional and patient care unit-specific actual and predicted nurse vacancies, nurse staff characteristics and nurse staffing changes. Reports generated from the process include forecasted shortfalls or surpluses of nurses, percentage of novice nurses, occupancy, sick time, overtime, agency use and other metrics. Readers will benefit from a description of the ways in which the data generated from the nursing resource analysis process are utilized at senior leadership, program and unit levels to support proactive hiring and resource allocation decisions and to predict unit-specific recruitment and retention patterns across multiple healthcare organizations and regions.
DOT National Transportation Integrated Search
2012-10-01
The Connected Vehicle Mobility Standards Coordination Plan project links activities in three programs (Data Capture and Management, Dynamic Mobility Applications, and ITS Standards). The plan coordinates the timing, intent and relationship of activit...
Explaining Verification Conditions
NASA Technical Reports Server (NTRS)
Deney, Ewen; Fischer, Bernd
2006-01-01
The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
Image processing system design for microcantilever-based optical readout infrared arrays
NASA Astrophysics Data System (ADS)
Tong, Qiang; Dong, Liquan; Zhao, Yuejin; Gong, Cheng; Liu, Xiaohua; Yu, Xiaomei; Yang, Lei; Liu, Weiyu
2012-12-01
Compared with traditional infrared imaging technology, the new type of optical-readout uncooled infrared imaging technology based on MEMS has many advantages, such as low cost, small size, and simple fabrication, and theory predicts high thermal detection sensitivity, so it has very broad application prospects in the field of high-performance infrared detection. This paper focuses on the image capturing and processing system for this optical-readout uncooled infrared imaging technology. The system consists of software and hardware. We build the core image processing hardware platform on TI's high-performance TMS320DM642 DSP and design the image capturing board around Micron's MT9P031, a high-frame-rate, low-power CMOS image sensor. Finally, we design the network output board around Intel's LXT971A network transceiver. The software is built on the DSP/BIOS real-time operating system: the video capture driver follows TI's class/mini-driver model, and the network output program uses the NDK kit for image capturing, processing, and transmission. Experiments show that the system achieves high capture resolution and fast processing speed, with network transmission speeds of up to 100 Mbps.
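The capture-process-transmit pipeline described here is easy to prototype on a desktop before committing to DSP hardware. The sketch below is a stand-in using OpenCV and a generic camera, not the DM642/DSP-BIOS code; the histogram-equalization step is merely a placeholder for the real optical-readout processing.

```python
import cv2

cap = cv2.VideoCapture(0)          # any attached camera stands in for the MT9P031
try:
    while True:
        ok, frame = cap.read()     # capture stage
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        processed = cv2.equalizeHist(gray)   # processing stage (placeholder)
        cv2.imshow("processed", processed)   # display stands in for network output
        if cv2.waitKey(1) == 27:   # Esc to quit
            break
finally:
    cap.release()
    cv2.destroyAllWindows()
```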
Interactive Voice/Web Response System in clinical research
Ruikar, Vrishabhsagar
2016-01-01
Emerging technologies in the computer and telecommunication industries have eased access to computers through the telephone. An Interactive Voice/Web Response System (IxRS) is a user-friendly system for end users, with complex, tailored programs at its backend. The backend programs are specially tailored for easy understanding by users. The clinical research industry has experienced a revolution in data capture methodologies over the past couple of decades, with different systems evolving around emerging technologies and tools, for example, Electronic Data Capture, IxRS, and electronic patient-reported outcomes. PMID:26952178
Capturing, Codifying and Scoring Complex Data for Innovative, Computer-Based Items.
ERIC Educational Resources Information Center
Luecht, Richard M.
The Microsoft Certification Program (MCP) includes many new computer-based item types, based on complex cases involving the Windows 2000 (registered) operating system. This Innovative Item Technology (IIT) has presented challenges beyond traditional psychometric considerations such as capturing and storing the relevant response data from…
None
2017-12-09
NETL's Carbon Sequestration Program is helping to develop technologies to capture, purify, and store carbon dioxide (CO2) in order to reduce greenhouse gas emissions without adversely influencing energy use or hindering economic growth. Carbon sequestration technologies capture and store CO2 that would otherwise reside in the atmosphere for long periods of time.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-09
... (PPRs) to capture quarterly and annual reports for each project type (Infrastructure, Public Computer... Information Collection; Comment Request; Broadband Technology Opportunities Program (BTOP) Quarterly and..., which included competitive grants to expand public computer center capacity and innovative programs to...
Performance Characteristics of a Kernel-Space Packet Capture Module
2010-03-01
AFIT/GCO/ENG/10-03. The proof of concept for this research is the design, development, and comparative performance analysis of a kernel-level N2d packet capture module, which can be used for both user-space and kernel-space capture applications in order to control the comparative performance analysis.
Purity and cleanness of aerogel as a cosmic dust capture medium
NASA Technical Reports Server (NTRS)
Tsou, P.; Fleming, R. H.; Lindley, P. M.; Craig, A. Y.; Blake, D.
1994-01-01
The capability for capturing micrometeoroids intact through laboratory simulations and in space in passive underdense silica aerogel offers a valuable tool for cosmic dust research. The integrity of the sample handling medium can substantially modify the integrity of the sample. Intact capture is a violent hypervelocity event: the integrity of the capturing medium can cause even greater modification of the sample. Doubts about the suitability of silica aerogel as a capture medium were raised at the 20th LPSC, and questions were raised again at the recent workshop on Particle Capture, Recovery, and Velocity Trajectory Measurement Technologies. Assessments of aerogel's volatile components and carbon content have been made. We report the results of laboratory measurements of the purity and cleanliness of silica aerogel used for several Sample Return Experiments flown on the Get Away Special program.
Cropper, Douglas P; Harb, Nidal H; Said, Patricia A; Lemke, Jon H; Shammas, Nicolas W
2018-04-01
We hypothesize that implementation of a safety program based on high reliability organization principles will reduce serious safety events (SSEs). The safety program focused on 7 essential elements: (a) safety rounding, (b) safety oversight teams, (c) safety huddles, (d) safety coaches, (e) good catches/safety heroes, (f) safety education, and (g) a red rule. An educational curriculum was implemented focusing on changing high-risk behaviors and implementing critical safety policies. All unusual occurrences were captured in the Midas system and investigated by risk specialists, the safety officer, and the chief medical officer. A multidepartmental committee evaluated these events, and a root cause analysis (RCA) was performed. Events were tabulated, and SSEs were recorded and plotted over time. Safety success stories (SSSs) were also evaluated over time. A steady drop in SSEs was seen over 9 years, along with a rise in SSSs, reflecting staff engagement in the program. The parallel changes in SSEs and SSSs as the various safety interventions were implemented strongly suggest that the program was successful in achieving its goals. A safety program that is based on high-reliability organization principles and made a core value of the institution can have a significant positive impact on reducing SSEs. © 2018 American Society for Healthcare Risk Management of the American Hospital Association.
Jeschke, E Ann
2016-01-01
In 2014, the Institute of Medicine published a meta-analysis on current military reintegration programs, suggesting they have failed to improve postdeployment behavioral health. In this chapter, I explore some of the issues associated with the two paradigm reintegration programs supported by the Department of Defense (DoD), namely, BATTLEMIND postdeployment debriefings and Master Resilience Training. My discussion is located within a subpopulation of military personnel I call warriors, particularly those men who have been exposed to combat. In performing a normative analysis of current reintegration programs, I rely on an ethics of embodied personal presence as a derivative focus of both nursing ethics and the just war tradition. Using an interdisciplinary approach to evaluate warriors' experiences of training across the military life cycle illustrates how reintegration challenges have been construed as potential pathology because disembodied reintegration programs do not consider the influence of military training and lifestyle in the development of certain health behaviors. When compared to the warrior's lived experience, a broader set of reintegration challenges emerges that cannot be fully captured by the symptoms of posttraumatic stress. Therefore, new reintegration programs need to be developed. Although I do not provide explicit details concerning what these reintegration programs should look like, I suggest that the DoD turn to something akin to the Healthy People campaign.
NASA Astrophysics Data System (ADS)
Imani Masouleh, Mehdi; Limebeer, David J. N.
2018-07-01
In this study we will estimate the region of attraction (RoA) of the lateral dynamics of a nonlinear single-track vehicle model. The tyre forces are approximated using rational functions that are shown to capture the nonlinearities of tyre curves significantly better than polynomial functions. An existing sum-of-squares (SOS) programming algorithm for estimating regions of attraction is extended to accommodate the use of rational vector fields. This algorithm is then used to find an estimate of the RoA of the vehicle lateral dynamics. The influence of vehicle parameters and driving conditions on the stability region are studied. It is shown that SOS programming techniques can be used to approximate the stability region without resorting to numerical integration. The RoA estimate from the SOS algorithm is compared to the existing results in the literature. The proposed method is shown to obtain significantly better RoA estimates.
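The claim that rational functions fit tyre curves better than polynomials is easy to check numerically. The sketch below (Python/SciPy) generates a Pacejka-style "magic formula" lateral-force curve with invented coefficients and compares one plausible odd rational form against a cubic polynomial; it illustrates the comparison only and is not the paper's parameterization.

```python
import numpy as np
from scipy.optimize import curve_fit

# Magic-formula lateral force with illustrative coefficients.
B, C, D, E = 10.0, 1.9, 1.0, 0.97
alpha = np.linspace(-0.3, 0.3, 200)   # slip angle [rad]
Fy = D * np.sin(C * np.arctan(B * alpha - E * (B * alpha - np.arctan(B * alpha))))

def rational(x, a1, a3, b2):
    """Odd rational approximant: (a1*x + a3*x^3) / (1 + b2*x^2)."""
    return (a1 * x + a3 * x**3) / (1 + b2 * x**2)

p_rat, _ = curve_fit(rational, alpha, Fy, p0=[10.0, -100.0, 50.0])
p_pol = np.polyfit(alpha, Fy, 3)      # cubic polynomial baseline

err_rat = np.max(np.abs(rational(alpha, *p_rat) - Fy))
err_pol = np.max(np.abs(np.polyval(p_pol, alpha) - Fy))
print(f"max fit error: rational {err_rat:.3f}, cubic {err_pol:.3f}")
```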
Basner, Jodi E; Theisz, Katrina I; Jensen, Unni S; Jones, C David; Ponomarev, Ilya; Sulima, Pawel; Jo, Karen; Eljanne, Mariam; Espey, Michael G; Franca-Koh, Jonathan; Hanlon, Sean E; Kuhn, Nastaran Z; Nagahara, Larry A; Schnell, Joshua D; Moore, Nicole M
2013-12-01
Development of effective quantitative indicators and methodologies to assess the outcomes of cross-disciplinary collaborative initiatives has the potential to improve scientific program management and scientific output. This article highlights an example of a prospective evaluation that has been developed to monitor and improve progress of the National Cancer Institute Physical Sciences-Oncology Centers (PS-OC) program. Study data, including collaboration information, was captured through progress reports and compiled using the web-based analytic database: Interdisciplinary Team Reporting, Analysis, and Query Resource. Analysis of collaborations was further supported by data from the Thomson Reuters Web of Science database, MEDLINE database, and a web-based survey. Integration of novel and standard data sources was augmented by the development of automated methods to mine investigator pre-award publications, assign investigator disciplines, and distinguish cross-disciplinary publication content. The results highlight increases in cross-disciplinary authorship collaborations from pre- to post-award years among the primary investigators and confirm that a majority of cross-disciplinary collaborations have resulted in publications with cross-disciplinary content that rank in the top third of their field. With these evaluation data, PS-OC Program officials have provided ongoing feedback to participating investigators to improve center productivity and thereby facilitate a more successful initiative. Future analysis will continue to expand these methods and metrics to adapt to new advances in research evaluation and changes in the program.
Gibau, Gina Sanchez
2015-01-01
Qualitative studies that examine the experiences of underrepresented minority students in science, technology, engineering, and mathematics fields are comparatively few. This study explores the self-reported experiences of underrepresented graduate students in the biomedical sciences at a large, midwestern, urban university. Document analysis of interview transcripts from program evaluations captures firsthand accounts of student experiences and reveals the need for a critical examination of current intervention programs designed to reverse the trend of underrepresentation in the biomedical sciences. Findings point to themes aligned around the benefits and challenges of program components, issues of social adjustment, the utility of supportive relationships, and environmental impacts. © 2015 G. S. Gibau. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Thege, Fredrik I; Lannin, Timothy B; Saha, Trisha N; Tsai, Shannon; Kochman, Michael L; Hollingsworth, Michael A; Rhim, Andrew D; Kirby, Brian J
2014-05-21
We have developed and optimized a microfluidic device platform for the capture and analysis of circulating pancreatic cells (CPCs) and pancreatic circulating tumor cells (CTCs). Our platform uses parallel anti-EpCAM and cancer-specific mucin 1 (MUC1) immunocapture in a silicon microdevice. Using a combination of anti-EpCAM and anti-MUC1 capture in a single device, we are able to achieve efficient capture while extending immunocapture beyond single-marker recognition. We have also detected a known oncogenic KRAS mutation in cells spiked into whole blood using immunocapture, RNA extraction, RT-PCR, and Sanger sequencing. To allow for downstream single-cell genetic analysis, intact nuclei were released from captured cells by using targeted membrane lysis. We have developed a staining protocol for clinical samples, including standard CTC markers (DAPI, cytokeratin (CK), and CD45) and a novel marker of carcinogenesis in CPCs, mucin 4 (MUC4). We have also demonstrated a semi-automated approach to image analysis and CPC identification, suitable for clinical hypothesis generation. Initial results from immunocapture of a clinical pancreatic cancer patient sample show that parallel capture may capture more of the heterogeneity of the CPC population. With this platform, we aim to develop a diagnostic biomarker for early pancreatic carcinogenesis and patient risk stratification.
Evaluation of the childhood obesity prevention program Kids--'Go for your life'.
de Silva-Sanigorski, Andrea; Prosser, Lauren; Carpenter, Lauren; Honisett, Suzy; Gibbs, Lisa; Moodie, Marj; Sheppard, Lauren; Swinburn, Boyd; Waters, Elizabeth
2010-05-28
Kids--'Go for your life' (K-GFYL) is an award-based health promotion program being implemented across Victoria, Australia. The program aims to reduce the risk of childhood obesity by improving the socio-cultural, policy and physical environments in children's care and educational settings. Membership of the K-GFYL program is open to all primary and pre-schools and early childhood services across the State. Once in the program, member schools and services are centrally supported to undertake the health promotion (intervention) activities. Once the K-GFYL program 'criteria' are reached, the school/service is assessed and 'awarded'. This paper describes the design of the evaluation of the statewide K-GFYL intervention program. The evaluation is mixed-method and cross-sectional and aims to: 1) Determine if K-GFYL award status is associated with more health-promoting environments in schools/services compared to those that are members only; 2) Determine if children attending K-GFYL award schools/services have higher levels of healthy eating and physical activity-related behaviors compared to those who are members only; 3) Examine the barriers to implementing and achieving the K-GFYL award; and 4) Determine the economic cost of implementing K-GFYL in primary schools. Parent surveys will capture information about the home environment and child dietary and physical activity-related behaviors. Environmental questionnaires in early childhood settings and schools will capture information on the physical activity and nutrition environment and current health promotion activities. Lunchbox surveys and a set of open-ended questions for kindergarten parents will provide additional data. Resource use associated with the intervention activities will be collected from primary schools for cost analysis. The K-GFYL award program is a community-wide intervention that requires a comprehensive, multi-level evaluation. The evaluation design is constrained by the lack of a non-K-GFYL control group, short time frames and delayed funding of this large-scale evaluation across all intervention settings. However, despite this, the evaluation will generate valuable evidence about the utility of a community-wide environmental approach to preventing childhood obesity, which will inform future public health policies and health promotion programs internationally. ACTRN12609001075279.
U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Years 2012 and 2013
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2014-01-01
The NASA U.S. spacesuit knowledge capture (KC) program has been in operation since the beginning of 2008. The program was designed to provide engineers and others with information about spacesuits in a historical context. A multitude of seminars have captured spacesuit history and knowledge over the last six years of the program's existence. Subject matter experts have provided lectures and were interviewed to help bring the spacesuit to life so that lessons learned will never be lost. The program has also reached out to the public and industry by making the recorded events part of the public domain through the NASA technical library via YouTube media. The U.S. spacesuit KC topics have included lessons learned from some of the most prominent spacesuit experts and spacesuit users, including current and former astronauts. The events have enriched the spacesuit legacy knowledge from the Gemini, Apollo, Skylab, Space Shuttle, and International Space Station Programs. Expert engineers and scientists have likewise shared their challenges and successes to be remembered. The last few years have been some of the most successful of the KC program's life, with numerous recordings and releases to the public, as evidenced by the thousands who have viewed the recordings online. This paper reviews the events accomplished and archived over Fiscal Years 2012 and 2013 and highlights a few of the most memorable ones. This paper also communicates ways to access the events, which are available internally to NASA as well as in the public domain.
A DNAzyme-mediated logic gate for programming molecular capture and release on DNA origami.
Li, Feiran; Chen, Haorong; Pan, Jing; Cha, Tae-Gon; Medintz, Igor L; Choi, Jong Hyun
2016-06-28
Here we design a DNA origami-based site-specific molecular capture and release platform operated by a DNAzyme-mediated logic gate process. We show the programmability and versatility of this platform with small molecules, proteins, and nanoparticles, which may also be controlled by external light signals.
Profcasts and Class Attendance--Does Year in Program Matter?
ERIC Educational Resources Information Center
Holbrook, Jane; Dupont, Christine
2009-01-01
The use of technology to capture the audio and visual elements of lectures, to engage students in course concepts, and to provide feedback to assignments has become a mainstream practice in higher education through podcasting and lecture capturing mechanisms. Instructors can create short podcasts or videos to produce "nuggets" of information for…
Managing complex processing of medical image sequences by program supervision techniques
NASA Astrophysics Data System (ADS)
Crubezy, Monica; Aubry, Florent; Moisan, Sabine; Chameroy, Virginie; Thonnat, Monique; Di Paola, Robert
1997-05-01
Our objective is to offer clinicians wider access to evolving medical image processing (MIP) techniques, crucial to improve assessment and quantification of physiological processes, but difficult to handle for non-specialists in MIP. Based on artificial intelligence techniques, our approach consists in the development of a knowledge-based program supervision system, automating the management of MIP libraries. It comprises a library of programs, a knowledge base capturing the expertise about programs and data, and a supervision engine. It selects, organizes, and executes the appropriate MIP programs given a goal to achieve and a data set, with dynamic feedback based on the results obtained. It also advises users in the development of new procedures chaining MIP programs. We have experimented with the approach for an application of factor analysis of medical image sequences as a means of predicting the response of osteosarcoma to chemotherapy, with both MRI and NM dynamic image sequences. As a result, our program supervision system frees clinical end-users from performing tasks outside their competence, permitting them to concentrate on clinical issues. Therefore our approach enables a better exploitation of the possibilities offered by MIP and higher-quality results, both in terms of robustness and reliability.
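To make the architecture concrete, here is a minimal sketch of a supervision engine that selects and chains processing steps from a knowledge base. The goal names, programs, and applicability tests are hypothetical, not the authors' system:

```python
# Illustrative sketch only: a tiny knowledge-based supervision engine.
from typing import Callable, Dict, List

# Knowledge base: each goal maps to an ordered plan whose steps carry a
# data-dependent applicability test.
KNOWLEDGE_BASE: Dict[str, List[dict]] = {
    "factor_analysis": [
        {"program": "denoise",   "applies": lambda d: d.get("noisy", False)},
        {"program": "register",  "applies": lambda d: d.get("n_frames", 1) > 1},
        {"program": "factorize", "applies": lambda d: True},
    ],
}

# Program library: each entry transforms the data description it receives.
PROGRAMS: Dict[str, Callable[[dict], dict]] = {
    "denoise":   lambda d: {**d, "noisy": False},
    "register":  lambda d: {**d, "registered": True},
    "factorize": lambda d: {**d, "factors": "computed"},
}

def supervise(goal: str, data: dict) -> dict:
    """Execute the applicable programs for a goal, re-testing the data after
    each step so results feed back into the plan (crude dynamic feedback)."""
    for step in KNOWLEDGE_BASE[goal]:
        if step["applies"](data):
            data = PROGRAMS[step["program"]](data)
    return data

print(supervise("factor_analysis", {"noisy": True, "n_frames": 40}))
```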
An approximate methods approach to probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.
1989-01-01
A major research and technology program in Probabilistic Structural Analysis Methods (PSAM) is currently being sponsored by the NASA Lewis Research Center with Southwest Research Institute as the prime contractor. This program is motivated by the need to accurately predict structural response in an environment where the loadings, the material properties, and even the structure may be considered random. The heart of PSAM is a software package which combines advanced structural analysis codes with a fast probability integration (FPI) algorithm for the efficient calculation of stochastic structural response. The basic idea of PSAM is simple: make an approximate calculation of system response, including calculation of the associated probabilities, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The resulting deterministic solution should give a reasonable and realistic description of performance-limiting system responses, although some error will be inevitable. If the simple model has correctly captured the basic mechanics of the system, however, including the proper functional dependence of stress, frequency, etc. on design parameters, then the response sensitivities calculated may be of significantly higher accuracy.
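The underlying probabilistic question is easy to state: what is the probability that a random response exceeds a random capacity? The sketch below illustrates it with brute-force Monte Carlo on a deliberately trivial response model (all distributions and units are hypothetical); PSAM's fast probability integration targets the same quantity with far fewer response evaluations.

```python
# Conceptual sketch: estimate the probability that a simplified structural
# response exceeds a limit, with random load and material strength.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
load = rng.normal(100.0, 15.0, n)       # applied load (hypothetical units)
strength = rng.normal(150.0, 10.0, n)   # material strength (hypothetical)

stress = load                            # trivial response model: stress = load
p_fail = np.mean(stress > strength)      # P(response exceeds capacity)
print(f"estimated failure probability: {p_fail:.4f}")
```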
An Aeroelastic Analysis of a Thin Flexible Membrane
NASA Technical Reports Server (NTRS)
Scott, Robert C.; Bartels, Robert E.; Kandil, Osama A.
2007-01-01
Studies have shown that significant vehicle mass and cost savings are possible with the use of ballutes for aero-capture. Through NASA's In-Space Propulsion program, a preliminary examination of ballute sensitivity to geometry and Reynolds number was conducted, and a single-pass coupling between an aero code and a finite element solver was used to assess the static aeroelastic effects. There remain, however, a variety of open questions regarding the dynamic aeroelastic stability of membrane structures for aero-capture, with the primary challenge being the prediction of the membrane flutter onset. The purpose of this paper is to describe and begin addressing these issues. The paper includes a review of the literature associated with the structural analysis of membranes and membrane flutter. Flow/structure analysis coupling and hypersonic flow solver options are also discussed. An approach is proposed for tackling this problem that starts with a relatively simple geometry and develops and evaluates analysis methods and procedures. This preliminary study considers a computationally manageable 2-dimensional problem. The membrane structural models used in the paper include a nonlinear finite-difference model for static and dynamic analysis and a NASTRAN finite element membrane model for nonlinear static and linear normal modes analysis. Both structural models are coupled with a structured compressible flow solver for static aeroelastic analysis. For dynamic aeroelastic analyses, the NASTRAN normal modes are used in the structured compressible flow solver, and 3rd-order piston theories are used with the finite-difference membrane model to simulate flutter onset. Results from the various static and dynamic aeroelastic analyses are compared.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-06
... Program/Cost Report (RSA 2) collects data on the vocational rehabilitation (VR) and supported employment... (Rehabilitation Act). The RSA-2 captures: administrative expenditures for the VR and SE programs; VR program... for the VR program by number of individuals served; the costs of types of services provided; and a...
ERIC Educational Resources Information Center
Dyehouse, Melissa; Bennett, Deborah; Harbor, Jon; Childress, Amy; Dark, Melissa
2009-01-01
Logic models are based on linear relationships between program resources, activities, and outcomes, and have been used widely to support both program development and evaluation. While useful in describing some programs, the linear nature of the logic model makes it difficult to capture the complex relationships within larger, multifaceted…
MEMBRANE PROCESS TO SEQUESTER CO2 FROM POWER PLANT FLUE GAS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tim Merkel; Karl Amo; Richard Baker
2009-03-31
The objective of this project was to assess the feasibility of using a membrane process to capture CO2 from coal-fired power plant flue gas. During this program, MTR developed a novel membrane (Polaris™) with a CO2 permeance tenfold higher than commercial CO2-selective membranes used in natural gas treatment. The Polaris™ membrane, combined with a process design that uses a portion of combustion air as a sweep stream to generate driving force for CO2 permeation, meets DOE post-combustion CO2 capture targets. Initial studies indicate a CO2 separation and liquefaction cost of $20 - $30/ton CO2 using about 15% of the plant energy at 90% CO2 capture from a coal-fired power plant. Production of the Polaris™ CO2 capture membrane was scaled up with MTR's commercial casting and coating equipment. Parametric tests of cross-flow and countercurrent/sweep modules prepared from this membrane confirm their near-ideal performance under expected flue gas operating conditions. Commercial-scale, 8-inch diameter modules also show stable performance in field tests treating raw natural gas. These findings suggest that membranes are a viable option for flue gas CO2 capture. The next step will be to conduct a field demonstration treating a real-world power plant flue gas stream. The first such MTR field test will capture 1 ton CO2/day at Arizona Public Service's Cholla coal-fired power plant, as part of a new DOE NETL funded program.
77 FR 39224 - Notice of Proposed Information Collection Requests; Office of Special Education and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-02
..., Rehabilitation Services Administration (RSA)-2 collects data on the vocational rehabilitation (VR) and supported... Rehabilitation Program/Cost Report (RSA-2) collects data on the vocational rehabilitation (VR) and supported... (Rehabilitation Act). The RSA-2 captures: administrative expenditures for the VR and SE programs; VR program...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ackermann, A.L.
1991-08-01
This Bulletin presents a summary of accomplishments and highlights in the Idaho National Engineering Laboratory's (INEL) Boron Neutron Capture Therapy (BNCT) Program for August 1991. This bulletin includes information on the brain tumor and melanoma research programs, Power Burst Facility (PBF) technical support and modifications, PBF operations, and updates to the animal data charts.
Converting from XML to HDF-EOS
NASA Technical Reports Server (NTRS)
Ullman, Richard; Bane, Bob; Yang, Jingli
2008-01-01
A computer program recreates an HDF-EOS file from an Extensible Markup Language (XML) representation of the contents of that file. This program is one of two programs written to enable testing of the schemas described in the immediately preceding article to determine whether the schemas capture all details of HDF-EOS files.
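A minimal sketch of such a round trip, under simplifying assumptions (a hypothetical XML layout, and plain HDF5 via h5py rather than full HDF-EOS structural metadata such as swath and grid objects), might look like:

```python
# Illustrative sketch, not the NASA program itself: rebuild a plain HDF5 file
# from a simple XML description of named datasets.
import xml.etree.ElementTree as ET
import numpy as np
import h5py

xml_doc = """
<file>
  <dataset name="Temperature" dtype="float32">1.5 2.0 3.25</dataset>
  <dataset name="Counts" dtype="int32">4 5 6</dataset>
</file>
"""

root = ET.fromstring(xml_doc)
with h5py.File("rebuilt.h5", "w") as f:
    for ds in root.iter("dataset"):
        # Parse whitespace-separated values, then cast to the declared dtype.
        data = np.array([float(v) for v in ds.text.split()], dtype=ds.get("dtype"))
        f.create_dataset(ds.get("name"), data=data)
```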
Ranking Surgical Residency Programs: Reputation Survey or Outcomes Measures?
Wilson, Adam B; Torbeck, Laura J; Dunnington, Gary L
2015-01-01
The release of general surgery residency program rankings by Doximity and U.S. News & World Report accentuates the need to define and establish measurable standards of program quality. This study evaluated the extent to which program rankings based solely on peer nominations correlated with familiar program outcomes measures. Publicly available data were collected for all 254 general surgery residency programs. To generate a rudimentary outcomes-based program ranking, surgery programs were rank-ordered according to an average percentile rank that was calculated using board pass rates and the prevalence of alumni publications. A Kendall τ-b rank correlation computed the linear association between program rankings based on reputation alone and those derived from outcomes measures to validate whether reputation was a reasonable surrogate for globally judging program quality. For the 218 programs with complete data eligible for analysis, the mean board pass rate was 72% with a standard deviation of 14%. A total of 60 programs were placed in the 75th percentile or above for the number of publications authored by program alumni. The correlational analysis reported a significant correlation of 0.428, indicating only a moderate association between programs ranked by outcomes measures and those ranked according to reputation. Seventeen programs that were ranked in the top 30 according to reputation were also ranked in the top 30 based on outcomes measures. This study suggests that reputation alone does not fully capture a representative snapshot of a program's quality. Rather, the use of multiple quantifiable indicators and attributes unique to programs ought to be given more consideration when assigning ranks to denote program quality. It is advised that the interpretation and subsequent use of program rankings be met with caution until further studies can rigorously demonstrate best practices for awarding program standings. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
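The correlational step is simple to reproduce in outline. This sketch applies SciPy's kendalltau (whose default handles ties in the tau-b manner) to two invented rankings; it shows the statistic only, not the study's data:

```python
# Compare a reputation-based ranking with an outcomes-based ranking using
# Kendall's tau-b. The rankings below are made up for illustration.
from scipy.stats import kendalltau

reputation_rank = [1, 2, 3, 4, 5, 6, 7, 8]
outcomes_rank   = [2, 1, 5, 3, 4, 8, 6, 7]

tau, p_value = kendalltau(reputation_rank, outcomes_rank)
print(f"Kendall tau-b = {tau:.3f}, p = {p_value:.3f}")
```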
Data reduction programs for a laser radar system
NASA Technical Reports Server (NTRS)
Badavi, F. F.; Copeland, G. E.
1984-01-01
The listing and description of software routines which were used to analyze the analog data obtained from the LIDAR system are given. All routines are written in FORTRAN-IV on an HP-1000/F minicomputer which serves as the heart of the data acquisition system for the LIDAR program. This particular system has 128 kilobytes of high-speed memory and is equipped with a Vector Instruction Set (VIS) firmware package, which is used in all the routines to handle quick execution of different long loops. The system handles floating-point arithmetic in hardware in order to enhance the speed of execution. This computer is a 2177 C/F series version of the HP-1000 RTE-IVB data acquisition computer system, which is designed for real-time data capture/analysis and a disk/tape mass storage environment.
Schuler, M; Musekamp, G; Bengel, J; Schwarze, M; Spanier, K; Gutenbrunner, Chr; Ehlebracht-König, I; Nolte, S; Osborne, R H; Faller, H
2014-11-01
To assess stable effects of self-management programs, measurement instruments should primarily capture the attributes of interest, for example, the self-management skills of the measured persons. However, measurements of psychological constructs are always influenced by both aspects of the situation (states) and aspects of the person (traits). This study tests whether the Health Education Impact Questionnaire (heiQ™), an instrument assessing a wide range of proximal outcomes of self-management programs, is primarily influenced by person factors rather than situational factors. Furthermore, measurement invariance over time, changes in traits, and predictors of change for each heiQ™ scale were examined. Subjects were N = 580 patients with rheumatism, asthma, orthopedic conditions or inflammatory bowel disease, who filled out the heiQ™ at the beginning of, at the end of, and 3 months after a disease-specific inpatient rehabilitation program in Germany. Structural equation modeling techniques were used to estimate latent trait-change models and test for measurement invariance in each heiQ™ scale. Coefficients of consistency, occasion specificity and reliability were computed. All scales showed scalar invariance over time. Reliability coefficients were high (0.80-0.94), and consistency coefficients (0.49-0.79) were always substantially higher than occasion-specificity coefficients (0.14-0.38), indicating that the heiQ™ scales primarily capture person factors. Trait changes with small to medium effect sizes were shown in five scales and were affected by sex, age and diagnostic group. The heiQ™ can be used to assess stable effects in important outcomes of self-management programs over time, e.g., changes in self-management skills or emotional well-being.
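The reported coefficients follow the standard latent state-trait decomposition, sketched here in generic notation (the textbook form, not necessarily the study's exact parameterization): an observed score splits into a trait component, an occasion-specific state residual, and measurement error.

```latex
Y_{it} = \lambda_{t}\,\xi_i + \zeta_{it} + \varepsilon_{it},
\qquad
\mathrm{Con} = \frac{\lambda_{t}^{2}\,\mathrm{Var}(\xi_i)}{\mathrm{Var}(Y_{it})},
\quad
\mathrm{Spe} = \frac{\mathrm{Var}(\zeta_{it})}{\mathrm{Var}(Y_{it})},
\quad
\mathrm{Rel} = \mathrm{Con} + \mathrm{Spe}.
```

Here $\xi_i$ is the latent trait of person $i$, $\zeta_{it}$ the state residual at occasion $t$, and $\varepsilon_{it}$ measurement error; consistency exceeding occasion specificity on every scale is what supports the conclusion that the heiQ™ primarily captures person factors.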
Qureshi, Samera Azeem; Lund, Annette Christin; Veierød, Marit Bragelien; Carlsen, Monica Hauger; Blomhoff, Rune; Andersen, Lene Frost; Ursin, Giske
2014-01-16
Fruit and vegetable intake has been found to reduce the risk of cardiovascular disease, certain types of cancer and diabetes mellitus. It is possible that antioxidants play a large part in this protective effect. However, it is not clear which foods account for the variation in antioxidant intake in a population. We used food frequency data from a population-based sample of women to identify the food items that contributed most to the variation in antioxidant intake in the Norwegian diet. We used data from a study conducted among participants in the Norwegian Breast Cancer Screening Program (NBCSP), the national program which invites women aged 50-69 years to mammographic screening every 2 years. A subset of 6514 women who attended the screening in 2006/2007 completed a food frequency questionnaire (FFQ). Daily intakes of energy, nutrients and antioxidants were estimated. We used multiple linear regression analysis to capture the variation in antioxidant intake. The mean (SD) antioxidant intake was 23.0 (8.5) mmol/day. Coffee consumption explained 54% of the variation in antioxidant intake, while fruits and vegetables explained 22%. The twenty food items that contributed most to the total variation in antioxidant intake explained 98% of the variation in intake. These included different types of coffee, tea, red wine, blueberries, walnuts, oranges, cinnamon and broccoli. In this study we identified a list of food items which capture the variation in antioxidant intake among these women. The major contributors to dietary total antioxidant intake were coffee, tea, red wine, blueberries, walnuts, oranges, cinnamon and broccoli. These items should be assessed in as much detail as possible in studies that wish to capture the variation in antioxidant intake.
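The analysis idea reduces to asking how much variance a few FFQ items explain. A minimal sketch with entirely synthetic data (the coefficients and distributions below are invented, not the NBCSP data):

```python
# How much of the variation in total antioxidant intake do a few food items
# explain? Measured as R^2 from ordinary least squares on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 500
coffee = rng.gamma(2.0, 2.0, n)       # cups/day (hypothetical)
fruit_veg = rng.gamma(3.0, 1.0, n)    # servings/day (hypothetical)
total_antiox = 2.5 * coffee + 1.0 * fruit_veg + rng.normal(0, 2, n)

X = np.column_stack([coffee, fruit_veg])
r2 = LinearRegression().fit(X, total_antiox).score(X, total_antiox)
print(f"variance explained by coffee + fruit/veg: R^2 = {r2:.2f}")
```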
Brunner, J; Krummenauer, F; Lehr, H A
2000-04-01
Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that inexpensive, commercially available software (Adobe Photoshop), run on a Macintosh G3 computer with an inbuilt graphic capture board, provides versatile, easy-to-use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density, and (iii) of postischemic leakage of FITC-labeled high-molecular-weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive, commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.
HIV incidence and CDC's HIV prevention budget: an exploratory correlational analysis.
Holtgrave, David R; Kates, Jennifer
2007-01-01
The central evaluative question about a national HIV prevention program is whether that program affects HIV incidence. Numerous factors may influence incidence, including public investment in HIV prevention. Few studies, however, have examined the relationship between public investment and the HIV epidemic in the United States. This 2006 exploratory analysis examined the period from 1978 through 2006 using a quantitative, lagged, correlational analysis to capture the relationship between national HIV incidence and Centers for Disease Control and Prevention's HIV prevention budget in the United States over time. The analyses suggest that early HIV incidence rose in advance of the nation's HIV prevention investment until the mid-1980s (1-year lag correlation, r=0.972, df=2, p <0.05). From that point on, it appears that the nation's investment in HIV prevention became a strong correlate of HIV incidence (1-year lag correlation, r=-0.905, df=18, p <0.05). This exploratory study provides correlational evidence of a relationship between U.S. HIV incidence and the federal HIV prevention budget over time, and calls for further analysis of the role of funding and other factors that may influence the direction of a nation's HIV epidemic.
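Once the two annual series are aligned, a 1-year lagged correlation is a one-liner; the sketch below uses invented numbers purely to show the computation:

```python
# Sketch of a 1-year lagged correlation on hypothetical annual series:
# correlate this year's budget with next year's incidence.
import numpy as np

budget    = np.array([100., 120, 150, 170, 200, 230, 260, 300])  # year t
incidence = np.array([ 50.,  48,  45,  41,  38,  34,  31,  29])  # year t

lag = 1  # budget in year t vs incidence in year t + 1
r = np.corrcoef(budget[:-lag], incidence[lag:])[0, 1]
print(f"1-year lagged correlation: r = {r:.3f}")
```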
On designing multicore-aware simulators for systems biology endowed with OnLine statistics.
Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo
2014-01-01
The paper presents enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool used to perform the modeling, the tuning, and the sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, turning into big data that should be analysed by statistical and data-mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for effectiveness of the online analysis in capturing biological systems' behavior, on a multicore platform and with representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems, exhibiting multistable and oscillatory behavior, are used as a testbed.
Handbook of capture-recapture analysis
Amstrup, Steven C.; McDonald, Trent L.; Manly, Bryan F.J.
2005-01-01
Every day, biologists in parkas, raincoats, and rubber boots go into the field to capture and mark a variety of animal species. Back in the office, statisticians create analytical models for the field biologists' data. But many times, representatives of the two professions do not fully understand one another's roles. This book bridges this gap by helping biologists understand state-of-the-art statistical methods for analyzing capture-recapture data. In so doing, statisticians will also become more familiar with the design of field studies and with the real-life issues facing biologists. Reliable outcomes of capture-recapture studies are vital to answering key ecological questions. Is the population increasing or decreasing? Do more or fewer animals have a particular characteristic? In answering these questions, biologists cannot hope to capture and mark entire populations. And frequently, the populations change unpredictably during a study. Thus, increasingly sophisticated models have been employed to convert data into answers to ecological questions. This book, by experts in capture-recapture analysis, introduces the most up-to-date methods for data analysis while explaining the theory behind those methods. Thorough, concise, and portable, it will be immensely useful to biologists, biometricians, and statisticians, students in both fields, and anyone else engaged in the capture-recapture process.
Treatment Cost Analysis Tool (TCAT) for Estimating Costs of Outpatient Treatment Services
Flynn, Patrick M.; Broome, Kirk M.; Beaston-Blaakman, Aaron; Knight, Danica K.; Horgan, Constance M.; Shepard, Donald S.
2009-01-01
A Microsoft® Excel-based workbook designed for research analysts to use in a national study was retooled for treatment program directors and financial officers to allocate, analyze, and estimate outpatient treatment costs in the U.S. This instrument can also be used as a planning and management tool to optimize resources and forecast the impact of future changes in staffing, client flow, program design, and other resources. The Treatment Cost Analysis Tool (TCAT) automatically provides feedback and generates summaries and charts using comparative data from a national sample of non-methadone outpatient providers. TCAT is being used by program staff to capture and allocate both economic and accounting costs, and outpatient service costs are reported for a sample of 70 programs. Costs for an episode of treatment in regular, intensive, and mixed types of outpatient treatment were $882, $1,310, and $1,381 respectively (based on 20% trimmed means and 2006 dollars). An hour of counseling cost $64 in regular, $85 in intensive, and $86 in mixed programs. Group counseling hourly costs per client were $8, $11, and $10 respectively for regular, intensive, and mixed. Future directions include use of a web-based interview version, much like some of the commercially available tax preparation software tools, and extensions for use in other modalities of treatment. PMID:19004576
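For readers unfamiliar with the summary statistic, the sketch below computes a trimmed mean with SciPy on invented costs; note that scipy.stats.trim_mean removes the stated proportion from each tail, which is one common reading of a "20% trimmed mean":

```python
# Robust cost summary: a trimmed mean damps the effect of extreme episodes.
# Episode costs below are made up for illustration.
from scipy.stats import trim_mean

episode_costs = [450, 610, 700, 820, 880, 910, 1050, 1200, 2400, 9000]
print(f"20% trimmed mean: ${trim_mean(episode_costs, 0.2):,.0f}")
```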
Assessment of the NASA Space Shuttle Program's Problem Reporting and Corrective Action System
NASA Technical Reports Server (NTRS)
Korsmeryer, D. J.; Schreiner, J. A.; Norvig, Peter (Technical Monitor)
2001-01-01
This paper documents the general findings and recommendations of the Design for Safety Program's study of the Space Shuttle Program (SSP) Problem Reporting and Corrective Action (PRACA) System. The goals of this study were to evaluate and quantify the technical aspects of the SSP's PRACA systems and to recommend enhancements addressing specific deficiencies in preparation for future system upgrades. The study determined that the extant SSP PRACA systems provided a project-level support capability through the use of a large pool of domain experts and a variety of distributed formal and informal database systems. This operational model is vulnerable to staff turnover and loss of the vast corporate knowledge that is not currently being captured by the PRACA system. A need for a program-level PRACA system providing improved insight, unification, knowledge capture, and collaborative tools was defined in this study.
NASA Technical Reports Server (NTRS)
Benbenek, Daniel B.; Walsh, William
2010-01-01
This greenbook captures some of the current, planned, and possible future uses of the Internet Protocol (IP) as part of space operations. It attempts to describe how the Internet Protocol is used in specific scenarios. The primary focus is low-Earth-orbit space operations, referred to here as the design reference mission (DRM), because most of the program experience drawn upon derives from this type of mission. Application profiles are provided. These include parameter settings that programs have proposed for sending IP datagrams over CCSDS links, the minimal subsets and features of the IP protocol suite and applications expected for interoperability between projects, and the configuration, operations, and maintenance of these IP functions. Of special interest is capturing the lessons learned from the Constellation Program in this area, since that program included a fairly ambitious use of the Internet Protocol.
Turpin, Aaron; Shier, Micheal L
2017-01-01
Improvements to the intrapersonal development of clients involved with substance use disorder treatment programs have widely been recognized as contributing to the intended goal of reducing substance misuse behaviors. This study sought to identify a broad framework of primary outcomes related to the intrapersonal development of clients in treatment for substance misuse. Using qualitative research methods, individual interviews were conducted with program participants (n = 41) at three treatment programs to identify the ways in which respondents experienced intrapersonal development through participation in treatment. The findings support the development of a conceptual model that captures the importance and manifestation of achieving improvements in the following outcomes: self-awareness, coping ability, self-worth, outlook, and self-determination. The findings provide a conceptual framework for client assessment that captures a broad range of the important intrapersonal development factors utilized as indicators for client development and recovery, which should be measured in tandem during assessment.
Microfluidic-Based Enrichment and Retrieval of Circulating Tumor Cells for RT-PCR Analysis.
Gogoi, Priya; Sepehri, Saedeh; Chow, Will; Handique, Kalyan; Wang, Yixin
2017-01-01
Molecular analysis of circulating tumor cells (CTCs) is hindered by the low sensitivity and high level of background leukocytes of currently available CTC enrichment technologies. We have developed a novel device to enrich and retrieve CTCs from blood samples by using a microfluidic chip. The Celsee PREP100 device captures CTCs with high sensitivity and allows the captured CTCs to be retrieved for molecular analysis. It uses a microfluidic chip which has approximately 56,320 capture chambers. Based on differences in cell size and deformability, each chamber ensures that small blood cells escape while larger CTCs of varying sizes are trapped and isolated in the chambers. In this report, we used the Celsee PREP100 to capture cancer cells spiked into normal donor blood samples. We were able to show that the device can capture as few as 10 cells with high reproducibility. The captured CTCs were retrieved from the microfluidic chip. The cell recovery rate of this back-flow procedure is 100% and the level of remaining background leukocytes is very low (about 300-400 cells). RNA from the retrieved cells is extracted and converted to cDNA, and gene expression analysis of selected cancer markers can be carried out by using RT-PCR assays. The sensitive and easy-to-use Celsee PREP100 system represents a promising technology for capturing and molecular characterization of CTCs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vahdat, Nader
2013-09-30
The project provided hands-on training and networking opportunities to undergraduate students in the area of carbon dioxide (CO2) capture and transport, through fundamental research study focused on advanced separation methods that can be applied to the capture of CO2 resulting from the combustion of fossil fuels for power generation. The project team's approach to achieving its objectives was to leverage existing Carbon Capture and Storage (CCS) course materials and teaching methods to create and implement an annual CCS short course for the Tuskegee University community; conduct a survey of CO2 separation and capture methods; utilize data to verify and develop computer models for CO2 capture; and build CCS networks and hands-on training experiences. The objectives accomplished as a result of this project were: (1) a comprehensive survey of CO2 capture methods was conducted and mathematical models were developed to compare the potential economics of the different methods based on the total cost per year per unit of CO2 avoidance; and (2) training was provided to introduce the latest CO2 capture technologies and deployment issues to the university community.
Effect of seeding on the capture of six stored product beetle species: The relative species matters
USDA-ARS?s Scientific Manuscript database
In trapping programs, prior capture of individuals of the same or different species may influence the subsequent attractiveness of the trap. To evaluate this process with stored-product insects, the effect of the presence of dead or alive adults on the behavioral responses of six stored product insect spe...
What's in a Name? Degree Programs and What They Tell Us about "Applied Linguistics" in Australia
ERIC Educational Resources Information Center
Murray, Neil; Crichton, Jonathan
2010-01-01
In this paper we explore the provision of applied linguistics within Australian universities. We focus on how the "what" of applied linguistics, as captured in scholarly definitions of the discipline, accords with the "where", as captured in potential contexts of application as these are manifested in provision. In doing so, we…
Quick and Easy: Use Screen Capture Software to Train and Communicate
ERIC Educational Resources Information Center
Schuster, Ellen
2011-01-01
Screen capture (screen cast) software can be used to develop short videos for training purposes. Developing videos is quick and easy. This article describes how these videos are used as tools to reinforce face-to-face and interactive TV curriculum training in a nutrition education program. Advantages of developing these videos are shared.…
USDA-ARS?s Scientific Manuscript database
Attractant-based traps are a cornerstone of detection, delimitation and eradication programs for tephritid fruit flies and other pests. The ideal trap and lure combination has high attraction (it brings pest tephritids to the trap from a distance) and high capture efficiency (it has a high probabili...
Spurgeon, Dale W
2016-12-01
The boll weevil (Anthonomus grandis grandis Boheman) has been eradicated from much of the United States, but remains an important pest of cotton (Gossypium spp.) in other parts of the Americas. Where the weevil occurs, the pheromone trap is a key tool for population monitoring or detection. Traditional monitoring programs have placed traps in or near the outermost cotton rows where damage by farm equipment can cause loss of trapping data. Recently, some programs have adopted a trap placement adjacent to but outside monitored fields. The effects of these changes have not been previously reported. Captures of early-season boll weevils by traps near (≤1 m) or far (7-10 m) from the outermost cotton row were evaluated. In 2005, during renewed efforts to eradicate the boll weevil from the Lower Rio Grande Valley of Texas, far traps consistently captured more weevils than traps near cotton. Traps at both placements indicated similar patterns of early-season weevil captures, which were consistent with those previously reported. In 2006, no distinction between trap placements was detected. Early-season patterns of captures in 2006 were again similar for both trap placements, but captures were much lower and less regular compared with those observed in 2005. These results suggest magnitude and likelihood of weevil capture in traps placed away from cotton are at least as high as for traps adjacent to cotton. Therefore, relocation of traps away from the outer rows of cotton should not negatively impact ability to monitor or detect the boll weevil. Published by Oxford University Press on behalf of Entomological Society of America 2016. This work is written by a US Government employee and is in the public domain in the US.
A Systematic Approach for Evaluation of Capture Zones at Pump and Treat Systems
This document describes a systematic approach for performing capture zone analysis associated with ground water pump and treat systems. A “capture zone” refers to the three-dimensional region that contributes the ground water extracted by one or more wells or drains. A capture ...
Autebert, Julien; Coudert, Benoit; Champ, Jérôme; Saias, Laure; Guneri, Ezgi Tulukcuoglu; Lebofsky, Ronald; Bidard, François-Clément; Pierga, Jean-Yves; Farace, Françoise; Descroix, Stéphanie; Malaquin, Laurent; Viovy, Jean-Louis
2015-05-07
A new generation of the Ephesia cell capture technology optimized for CTC capture and genetic analysis is presented, characterized in depth and compared with the CellSearch system as a reference. This technology uses magnetic particles bearing tumour-cell specific EpCAM antibodies, self-assembled in a regular array in a microfluidic flow cell. 48,000 high aspect-ratio columns are generated using a magnetic field in a high throughput (>3 ml h(-1)) device and act as sieves to specifically capture the cells of interest through antibody-antigen interactions. Using this device optimized for CTC capture and analysis, we demonstrated the capture of epithelial cells with capture efficiency above 90% for concentrations as low as a few cells per ml. We showed the high specificity of capture with only 0.26% of non-epithelial cells captured for concentrations above 10 million cells per ml. We investigated the capture behavior of cells in the device, and correlated the cell attachment rate with the EpCAM expression on the cell membranes for six different cell lines. We developed and characterized a two-step blood processing method to allow for rapid processing of 10 ml blood tubes in less than 4 hours, and showed a capture rate of 70% for as low as 25 cells spiked in 10 ml blood tubes, with less than 100 contaminating hematopoietic cells. Using this device and procedure, we validated our system on patient samples using an automated cell immunostaining procedure and a semi-automated cell counting method. Our device captured CTCs in 75% of metastatic prostate cancer patients and 80% of metastatic breast cancer patients, and showed similar or better results than the CellSearch device in 10 out of 13 samples. Finally, we demonstrated the possibility of detecting cancer-related PIK3CA gene mutation in 20 cells captured in the chip with a good correlation between the cell count and the quantitation value Cq of the post-capture qPCR.
NASA Technical Reports Server (NTRS)
Johnson, Teresa A.
2006-01-01
Knowledge management is a proactive pursuit for the future success of any large organization faced with the imminent possibility that its senior managers and engineers, with their gained experiences and lessons learned, plan to retire in the near term. Safety and Mission Assurance (S&MA) is proactively pursuing unique mechanisms to ensure knowledge is retained and lessons learned are captured and documented. Knowledge capture events, activities, and management help to provide a gateway between future retirees and our next generation of managers and engineers. S&MA hosted two knowledge capture events during 2005 featuring three of its retiring fellows (Axel Larsen, Dave Whittle, and Gary Johnson). The first event, on February 24, 2005, focused on two Safety and Mission Assurance safety panels (the Space Shuttle System Safety Review Panel (SSRP) and the Payload Safety Review Panel (PSRP)); the latter event, on December 15, 2005, featured lessons learned during Apollo, Skylab, and the Space Shuttle which could be applicable in the newly created Crew Exploration Vehicle (CEV)/Constellation development program. Gemini, Apollo, Skylab, and the Space Shuttle promised and delivered exciting human advances in space and benefits of space in people's everyday lives on Earth. Johnson Space Center's S&MA team's work over the last 20 years has been mostly focused on operations; we are now beginning the Exploration development program. S&MA will promote an atmosphere of knowledge sharing in its formal and informal cultures and work processes, and reward the open dissemination and sharing of information; we are asking, "Why embrace relearning the lessons learned of the past?" On the Exploration program the focus will be on Design, Development, Test, and Evaluation (DDT&E); therefore, it is critical to understand the lessons from these past programs during the DDT&E phase.
Meteoroid capture cell construction
NASA Technical Reports Server (NTRS)
Zook, H. A.; High, R. W. (Inventor)
1976-01-01
A thin membrane covering the open side of a meteoroid capture cell causes an impacting meteoroid to disintegrate as it penetrates the membrane. The capture cell then contains and holds the meteoroid particles for later analysis.
Seamless presentation capture, indexing, and management
NASA Astrophysics Data System (ADS)
Hilbert, David M.; Cooper, Matthew; Denoue, Laurent; Adcock, John; Billsus, Daniel
2005-10-01
Technology abounds for capturing presentations. However, no simple solution exists that is completely automatic. ProjectorBox is a "zero user interaction" appliance that automatically captures, indexes, and manages presentation multimedia. It operates continuously to record the RGB information sent from presentation devices, such as a presenter's laptop, to display devices, such as a projector. It seamlessly captures high-resolution slide images, text and audio. It requires no operator, specialized software, or changes to current presentation practice. Automatic media analysis is used to detect presentation content and segment presentations. The analysis substantially enhances the web-based user interface for browsing, searching, and exporting captured presentations. ProjectorBox has been in use for over a year in our corporate conference room, and has been deployed in two universities. Our goal is to develop automatic capture services that address both corporate and educational needs.
Linking animal-borne video to accelerometers reveals prey capture variability.
Watanabe, Yuuki Y; Takahashi, Akinori
2013-02-05
Understanding foraging is important in ecology, as it determines the energy gains and, ultimately, the fitness of animals. However, monitoring prey captures of individual animals is difficult. Direct observations using animal-borne videos have short recording periods, and indirect signals (e.g., stomach temperature) are never validated in the field. We took an integrated approach to monitor prey captures by a predator by deploying a video camera (lasting for 85 min) and two accelerometers (on the head and back, lasting for 50 h) on free-swimming Adélie penguins. The movies showed that penguins moved their heads rapidly to capture krill in midwater and fish (Pagothenia borchgrevinki) underneath the sea ice. Captures were remarkably fast (two krill per second in swarms) and efficient (244 krill or 33 P. borchgrevinki in 78-89 min). Prey captures were detected by the signal of head acceleration relative to body acceleration with high sensitivity and specificity (0.83-0.90), as shown by receiver-operating characteristic analysis. Extension of the signal analysis to the entire behavioral records showed that krill captures were spatially and temporally more variable than P. borchgrevinki captures. Notably, the frequency distribution of krill capture rate closely followed a power-law model, indicating that the foraging success of penguins depends on a small number of very successful dives. The three steps illustrated here (i.e., video observations, linking video to behavioral signals, and extension of signal analysis) are unique approaches to understanding the spatial and temporal variability of ecologically important events such as foraging.
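The detection logic lends itself to a compact sketch: score each instant by head acceleration relative to body acceleration, then evaluate the score against labeled events with ROC analysis. Everything below is synthetic (the signal model, event rate, and noise levels are assumptions, not the penguin data):

```python
# Flag prey captures when head acceleration spikes relative to body
# acceleration; score the detector with the area under the ROC curve.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 2000
labels = rng.random(n) < 0.05                 # true capture events (5%)
body = rng.normal(0.0, 0.3, n)                # body acceleration (synthetic)
head = body + rng.normal(0.0, 0.3, n) + 2.0 * labels  # head spikes at captures

signal = head - body                          # head movement relative to body
auc = roc_auc_score(labels, signal)
print(f"ROC AUC of head-minus-body signal: {auc:.2f}")
```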
Report on all ARRA Funded Technical Work
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
2013-10-05
The main focus of this American Recovery and Reinvestment Act of 2009 (ARRA) funded project was to design an energy efficient carbon capture and storage (CCS) process using the Recipient's membrane system for H{sub 2} separation and CO{sub 2} capture. In the ARRA-funded project, the Recipient accelerated development and scale-up of ongoing hydrogen membrane technology research and development (R&D). Specifically, this project focused on accelerating the current R&D work scope of the base program-funded project, involving lab scale tests, detailed design of a 250 lb/day H{sub 2} process development unit (PDU), and scale-up of membrane tube and coating manufacturing. This project scope included the site selection and a Front End Engineering Design (FEED) study of a nominally 4 to 10 ton-per-day (TPD) Pre-Commercial Module (PCM) hydrogen separation membrane system. Process models and techno-economic analysis were updated to include studies on integration of this technology into an Integrated Gasification Combined Cycle (IGCC) power generation system with CCS.
Hu, Chenghuan; Huang, Feizhou; Zhang, Rui; Zhu, Shaihong; Nie, Wanpin; Liu, Xunyang; Liu, Yinglong; Li, Peng
2015-01-01
Using optics combined with automatic control and computer real-time image detection technology, a novel noninvasive method of noncontact pressure manometry was developed based on airflow and laser detection technology in this study. The new esophageal venous pressure measurement system was tested in in-vitro experiments. A stable and adjustable pulse stream was produced by a self-developed pump, and a laser-emitting apparatus generated optical signals that could be captured by an image acquisition and analysis program. A synchronization system simultaneously measured the changes of air pressure and the deformation of the vein wall, capturing the vascular deformation while recording the current pressure value. The results of this study indicated that the pressure values measured by the new method correlate well with the actual pressure values in animal experiments. The new method of noninvasive pressure measurement based on airflow and laser detection technology is accurate, feasible, and repeatable, and has good application prospects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huml, O.
The objective of this work was to determine the neutron flux density distribution in various places of the training reactor VR-1 Sparrow. This experiment was performed on the new core design C1, composed of the new low-enriched uranium fuel cells IRT-4M (19.7 %). This fuel replaced the old high-enriched uranium fuel IRT-3M (36 %) within the framework of the RERTR Program in September 2005. The measurement used the neutron activation analysis method with gold wires. The principle of this method consists in neutron capture in a nucleus of the material forming the activation detector. This capture can change the nucleus into a radioisotope, whose activity can be measured. The absorption cross-section values were evaluated by the MCNP computer code. The gold wires were irradiated in seven different positions in the core C1. All irradiations were performed at reactor power level 1E8 (1 kW{sub therm}). The activity of segments of irradiated wires was measured by a special automatic device called 'Drat' ('Wire' in English). (author)
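For orientation, the flux follows from the standard activation relation (generic textbook form, not necessarily the report's exact expression): with N target nuclei of 197Au in the wire segment, effective capture cross-section σ, product decay constant λ (for 198Au), and irradiation time t_irr,

```latex
A = N\,\sigma\,\varphi\,\bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr)
\quad\Longrightarrow\quad
\varphi = \frac{A}{N\,\sigma\,\bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr)},
```

so the measured activity A of each wire segment yields the local neutron flux density φ.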
Capturing dynamic processes of change in GROW mutual help groups for mental health.
Finn, Lizzie D; Bishop, Brian J; Sparrow, Neville
2009-12-01
The need for a model that can portray dynamic processes of change in mutual help groups for mental health (MHGMHs) is emphasized. A dynamic process model has the potential to capture a more comprehensive understanding of how MHGMHs may assist their members. An investigation into GROW, a mutual help organization for mental health, employed ethnographic, phenomenological, and collaborative research methods. The study examined how GROW impacts psychological well-being. Study outcomes aligned with the social ecological paradigm (Maton in Understanding the self-help organization: frameworks and findings. Sage, Thousand Oaks 1994), indicating multifactorial processes of change at and across three levels of analysis: group level, GROW program/community level, and individual level. Outcome themes related to life skills acquisition and a change in self-perception in terms of belonging within community and an increased sense of personal value. The GROW findings are used to assist development of a dynamic multi-dimensional process model to explain how MHGMHs may promote positive change.
Enhanced capture of elemental mercury by bamboo-based sorbents.
Tan, Zengqiang; Xiang, Jun; Su, Sheng; Zeng, Hancai; Zhou, Changsong; Sun, Lushi; Hu, Song; Qiu, Jianrong
2012-11-15
To develop a cost-effective sorbent for gas-phase elemental mercury removal, bamboo charcoal (BC) produced from renewable bamboo and KI-modified BC (BC-I) were evaluated. The effects of NO and SO2 on gas-phase Hg0 adsorption by KI-modified BC were evaluated in a fixed-bed reactor using an online mercury analyzer. BET surface area analysis, temperature-programmed desorption (TPD), and X-ray photoelectron spectroscopy (XPS) were used to determine the pore structure and surface chemistry of the sorbents. The results show that KI impregnation reduced the sorbents' BET surface area and total pore volume compared with those of the original BC, but BC-I has excellent adsorption capacity for elemental mercury at the relatively high temperatures of 140 °C and 180 °C. The presence of NO or SO2 could inhibit Hg0 capture, but BC-I has strong anti-poisoning ability. The specific reaction mechanism is analyzed further. Copyright © 2012 Elsevier B.V. All rights reserved.
Flexcam Image Capture Viewing and Spot Tracking
NASA Technical Reports Server (NTRS)
Rao, Shanti
2008-01-01
Flexcam software was designed to allow continuous monitoring of the mechanical deformation of the telescope structure at Palomar Observatory. Flexcam allows the user to watch the motion of a star with a low-cost astronomical camera, to measure the motion of the star on the image plane, and to feed this data back into the telescope's control system. This automatic interaction between the camera and a user interface facilitates integration and testing. Flexcam is a CCD image capture and analysis tool for the ST-402 camera from Santa Barbara Instruments Group (SBIG). This program will automatically take a dark exposure and then continuously display corrected images. The image size, bit depth, magnification, exposure time, resolution, and filter are always displayed on the title bar. Flexcam locates the brightest pixel and then computes the centroid position of the pixels falling in a box around that pixel. This tool continuously writes the centroid position to a network file that can be used by other instruments.
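The brightest-pixel-plus-box-centroid step the abstract describes is a standard spot-tracking idiom. Below is a minimal sketch of it; the box size, the crude local background subtraction, and the function name are assumptions for illustration, not Flexcam's actual parameters.

```python
import numpy as np

def spot_centroid(image, box=15):
    """Find the brightest pixel, then return the intensity-weighted centroid
    of a box centered on it, as (x, y) in pixel coordinates."""
    img = np.asarray(image, dtype=float)
    y0, x0 = np.unravel_index(np.argmax(img), img.shape)   # brightest pixel
    half = box // 2
    ys = slice(max(y0 - half, 0), min(y0 + half + 1, img.shape[0]))
    xs = slice(max(x0 - half, 0), min(x0 + half + 1, img.shape[1]))
    patch = img[ys, xs] - img[ys, xs].min()                # crude local background removal
    total = patch.sum()
    if total == 0:                                         # flat patch: fall back
        return float(x0), float(y0)
    yy, xx = np.mgrid[ys, xs]
    return (xx * patch).sum() / total, (yy * patch).sum() / total
```

A weighted centroid of this kind gives sub-pixel position estimates, which is what makes feeding the star position back into the telescope control loop useful at all.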
Health Science Students' Perception about Research Training Programs Offered in Saudi Universities
ERIC Educational Resources Information Center
Al Kuwaiti, Ahmed; Subbarayalu, Arun Vijay
2015-01-01
Purpose: The purpose of this paper was to examine the perceptions of students of health sciences on research training programs offered at Saudi universities. Design/methodology/approach: A cross-sectional survey design was adopted to capture the perceptions of health science students about research training programs offered at selected Saudi…
USDA-ARS?s Scientific Manuscript database
Eradication programs for the boll weevil (Anthonomus grandis Boheman) rely on pheromone-baited traps to trigger insecticide treatments and monitor program progress. A key objective of monitoring in these programs is the timely detection of incipient weevil populations to limit or prevent re-infestat...
Regulatory considerations of occupational tuberculosis control.
McDiarmid, M A; Gillen, N A; Hathon, L
1994-01-01
The authors argue that the classic hierarchy of industrial hygiene controls may be successfully used to control TB. Various elements of hygiene control programs reviewed here include TB exposure control programs, identification and isolation of patients, respiratory isolation, local source capture ventilation, laboratory procedures, employee surveillance programs, reporting of occupational illnesses, labeling requirements, and respiratory protection.
INEL BNCT Program: Volume 5, No. 9
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ackermann, A.L.
1991-01-01
This Bulletin presents a summary of accomplishments and highlights of the Idaho National Engineering Laboratory's (INEL) Boron Neutron Capture Therapy (BNCT) Program for September 1991. This bulletin includes information on the brain tumor and melanoma research programs, Power Burst Facility (PBF) technical support and modifications, PBF operations, and updates to the animal data charts.
A framework for evaluating national space activity
NASA Astrophysics Data System (ADS)
Wood, Danielle; Weigel, Annalisa
2012-04-01
Space technology and resources are used around the world to address societal challenges. Space provides valuable satellite services, unique scientific discoveries, surprising technology applications and new economic opportunities. Many developing countries formally recognize the advantages of space resources and pursue national level activity to harness them. There is limited data or documentation on the space activities of developing countries. Meanwhile, traditional approaches to summarize national space activity do not necessarily capture the types of activity that developing countries pursue in space. This is especially true if they do not have a formal national space program or office. Developing countries pursue national space activity through activities of many types—from national satellite programs to commercial use of satellite services to involvement with international space institutions. This research aims to understand and analyze these trends. This paper introduces two analytical frameworks for evaluating space activity at the national level. The frameworks are specifically designed to capture the activity of countries that have traditionally been less involved in space. They take a broad view of space related activity across multiple societal sectors and disciplines. The discussion explains the approach for using the frameworks as well as illustrative examples of how they can be applied as part of a research process. The first framework is called the Mission and Management Ladders. This framework considers specific space projects within countries and ranks them on "Ladders" that measure technical challenge and managerial autonomy. This first method is at a micro level of analysis. The second framework is called the Space Participation Metric (SPM). The SPM can be used to assign a Space Participation score to countries based on their involvement in various space related activities. This second method uses a macro level of analysis. The authors developed both frameworks as part of a long term research program about the space activities of developing countries. This aspect of the research focuses on harnessing multiple techniques to summarize complex, multi-disciplinary information about global space activity.
Physical Activation of Oil Palm Empty Fruit Bunch via CO2 Activation Gas for CO2 Adsorption
NASA Astrophysics Data System (ADS)
Joseph, C. G.; Quek, K. S.; Daud, W. M. A. W.; Moh, P. Y.
2017-06-01
In this study, different parameters for the preparation of activated carbon were investigated for their effects on yield and CO2 capture capability. The activated carbon was prepared from Oil Palm Empty Fruit Bunch (OPEFB) via a two-step physical activation process. The OPEFB was pyrolyzed under inert conditions at 500 °C and activated with CO2. A two-factorial design was employed, and the effects of activation temperature, activation dwell time, and gas flow rate on yield and CO2 capture were compared and studied. The yield obtained ranged from 20 to 26, and activation temperature was determined to be the most significant factor influencing CO2 uptake. The CO2 capture capacity was determined using the Temperature Programmed Desorption (TPD) technique. The CO2 uptake of EFB activated carbon was between 1.85 and 2.09 mmol/g. TPD analysis showed that the surface of the activated carbon was basic in nature; the activated carbon was found to withhold CO2 up to 663 °C before maximum desorption occurs. The surface area and pore size of OPEFB obtained from BET analysis were 2.17 m2 g-1 and 0.01 cm3 g-1. After activation, both surface area and pore size increased, with a maximum observed surface area and pore size of 548.07 m2 g-1 and 0.26 cm3 g-1. Surface morphology, functional groups, pore size, and surface area were analyzed using SEM, FT-IR, TPD, and BET.
Vakil, Rachit M.; Chaudhry, Zoobia W.; Doshi, Ruchi S.; Clark, Jeanne M.; Gudzune, Kimberly A.
2017-01-01
Objective: To characterize weight-loss claims and disclaimers present on websites for commercial weight-loss programs and compare them to results from published randomized controlled trials (RCTs). Methods: We performed a content analysis of all homepages and testimonials available on the websites of 24 randomly selected programs. Two team members independently reviewed each page and abstracted information from text and images to capture relevant content, including demographics, weight loss, and disclaimers. We performed a systematic review to evaluate the efficacy of these programs by searching MEDLINE and the Cochrane Database of Systematic Reviews, and abstracted mean weight change from each included RCT. Results: Overall, the amount of weight loss portrayed in the testimonials was extreme across all programs examined (median weight loss ranged from 10.7 to 49.5 kg). Only 10 of the 24 programs had eligible RCTs. Median weight losses reported in testimonials exceeded those achieved by trial participants. Most programs with RCTs (78%) provided disclaimers stating that the testimonial's results were non-typical and/or giving a range of typical weight loss. Conclusion: Weight-loss claims within testimonials were higher than results from RCTs. Future studies should examine whether commercial programs' advertising practices influence patients' expectations or satisfaction with modest weight-loss results. PMID:28865085
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; and ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Pilot testing of a membrane system for postcombustion CO 2 capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merkel, Tim; Kniep, Jay; Wei, Xiaotong
2015-09-30
This final report summarizes work conducted for the U.S. Department of Energy, National Energy Technology Laboratory (DOE) to scale up an efficient post-combustion CO2 capture membrane process to the small pilot test stage (award number DE-FE0005795). The primary goal of this research program was to design, fabricate, and operate a membrane CO2 capture system to treat coal-derived flue gas containing 20 tonnes CO2/day (20 TPD). Membrane Technology and Research (MTR) conducted this project in collaboration with Babcock and Wilcox (B&W), the Electric Power Research Institute (EPRI), WorleyParsons (WP), the Illinois Sustainable Technology Center (ISTC), Enerkem (EK), and the National Carbon Capture Center (NCCC). In addition to the small pilot design, build and slipstream testing at NCCC, other project efforts included laboratory membrane and module development at MTR, validation field testing on a 1 TPD membrane system at NCCC, boiler modeling and testing at B&W, a techno-economic analysis (TEA) by EPRI/WP, a case study of the membrane technology applied to a ~20 MWe power plant by ISTC, and an industrial CO2 capture test at an Enerkem waste-to-biofuel facility. The 20 TPD small pilot membrane system built in this project successfully completed over 1,000 hours of operation treating flue gas at NCCC. The Polaris™ membranes used on this system demonstrated stable performance, and when combined with over 10,000 hours of operation at NCCC on a 1 TPD system, the risk associated with uncertainty in the durability of post-combustion capture membranes has been greatly reduced. Moreover, next-generation Polaris membranes with higher performance and lower cost were validation tested on the 1 TPD system. The 20 TPD system also demonstrated successful operation of a new low-pressure-drop sweep module that will reduce parasitic energy losses at full scale by as much as 10 MWe. In modeling and pilot boiler testing, B&W confirmed the viability of CO2 recycle to the boiler as envisioned in the MTR process design. The impact of this CO2 recycle on boiler efficiency was quantified and incorporated into a TEA of the membrane capture process applied to a full-scale power plant. As with previous studies, the TEA showed the membrane process to be lower cost than the conventional solvent capture process even at 90% CO2 capture. A sensitivity study indicates that the membrane capture cost decreases significantly if the 90% capture requirement is relaxed. Depending on the process design, a minimum capture cost is achieved at 30-60% capture, values that would meet proposed CO2 emission regulations for coal-fired power plants. In summary, this project has successfully advanced the MTR membrane capture process through small pilot testing (technology readiness level 6). The technology is ready for future scale-up to the 10 MWe size.
Basner, Jodi E.; Theisz, Katrina I.; Jensen, Unni S.; Jones, C. David; Ponomarev, Ilya; Sulima, Pawel; Jo, Karen; Eljanne, Mariam; Espey, Michael G.; Franca-Koh, Jonathan; Hanlon, Sean E.; Kuhn, Nastaran Z.; Nagahara, Larry A.; Schnell, Joshua D.; Moore, Nicole M.
2013-01-01
Development of effective quantitative indicators and methodologies to assess the outcomes of cross-disciplinary collaborative initiatives has the potential to improve scientific program management and scientific output. This article highlights an example of a prospective evaluation that has been developed to monitor and improve progress of the National Cancer Institute Physical Sciences—Oncology Centers (PS-OC) program. Study data, including collaboration information, was captured through progress reports and compiled using the web-based analytic database: Interdisciplinary Team Reporting, Analysis, and Query Resource. Analysis of collaborations was further supported by data from the Thomson Reuters Web of Science database, MEDLINE database, and a web-based survey. Integration of novel and standard data sources was augmented by the development of automated methods to mine investigator pre-award publications, assign investigator disciplines, and distinguish cross-disciplinary publication content. The results highlight increases in cross-disciplinary authorship collaborations from pre- to post-award years among the primary investigators and confirm that a majority of cross-disciplinary collaborations have resulted in publications with cross-disciplinary content that rank in the top third of their field. With these evaluation data, PS-OC Program officials have provided ongoing feedback to participating investigators to improve center productivity and thereby facilitate a more successful initiative. Future analysis will continue to expand these methods and metrics to adapt to new advances in research evaluation and changes in the program. PMID:24808632
FRAP Analysis: Accounting for Bleaching during Image Capture
Wu, Jun; Shekhar, Nandini; Lele, Pushkar P.; Lele, Tanmay P.
2012-01-01
The analysis of Fluorescence Recovery After Photobleaching (FRAP) experiments involves mathematical modeling of the fluorescence recovery process. An important feature of FRAP experiments that tends to be ignored in the modeling is that there can be a significant loss of fluorescence due to bleaching during image capture. In this paper, we explicitly include the effects of bleaching during image capture in the model for the recovery process, instead of correcting for the effects of bleaching using reference measurements. Using experimental examples, we demonstrate the usefulness of such an approach in FRAP analysis. PMID:22912750
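The core idea in the abstract above is that acquisition bleaching can be folded into the recovery model itself rather than corrected away with reference measurements. A minimal sketch of that idea follows, assuming a single-exponential recovery multiplied by a fixed per-image bleaching loss; the model form, function name, and initial guesses are illustrative assumptions, not the paper's actual model.

```python
import numpy as np
from scipy.optimize import curve_fit

def frap_with_acquisition_bleaching(n, f0, f_inf, tau, k_bleach, dt=1.0):
    """Single-exponential FRAP recovery multiplied by a per-image bleaching
    decay, so bleaching during image capture is part of the fitted model.
    n is the image index, dt the frame interval."""
    t = n * dt
    recovery = f0 + (f_inf - f0) * (1.0 - np.exp(-t / tau))
    return recovery * np.exp(-k_bleach * n)   # each exposure bleaches a fixed fraction

# Usage sketch: 'intensities' is the measured ROI fluorescence per image.
# n = np.arange(len(intensities))
# popt, pcov = curve_fit(frap_with_acquisition_bleaching, n, intensities,
#                        p0=[intensities[0], intensities[-1], 10.0, 0.01])
```

Fitting k_bleach jointly with the recovery parameters is what removes the need for a separate, unbleached reference region.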
Shock Layer Radiation Measurements and Analysis for Mars Entry
NASA Technical Reports Server (NTRS)
Bose, Deepak; Grinstead, Jay Henderson; Bogdanoff, David W.; Wright, Michael J.
2009-01-01
NASA's In-Space Propulsion program is supporting the development of shock radiation transport models for aerocapture missions to Mars. A comprehensive test series in the NASA Ames Electric Arc Shock Tube facility at a representative flight condition was recently completed. The facility optical instrumentation enabled spectral measurements of shocked gas radiation from the vacuum ultraviolet to the near infrared. The instrumentation captured the nonequilibrium post-shock excitation and relaxation dynamics of dispersed spectral features. A description of the shock tube facility, optical instrumentation, and examples of the test data are presented. Comparisons of measured spectra with model predictions are also made.
Techniques and instrumentation effort for whale migration tracking
NASA Technical Reports Server (NTRS)
Goodman, R. M.; Norris, K. S.; Hobbs, L.; Gibson, R. J.; Dougherty, E.; Palladino, J.
1975-01-01
The following aspects of a research program concerned with tracking gray whales were documented: (1) design, fabrication and testing of a girdle-type harness and associated gear (release mechanism, tracking transmitter, xenon flasher), (2) design, fabrication and testing of instrumentation packs (subminiature recorder, sensor, electronics), (3) field preparations for the January-February 1974 expedition off Mexico, (4) travel arrangements, (5) preliminary field report (capture and handling of juvenile whales, instrumentation and housing tests, harness abrasion and chafing, respiration measurements, sea tracking, distribution, number, and behavior of whales at Lopez Mateos), (6) review, data reduction, and analysis of results.
Improving designer productivity
NASA Technical Reports Server (NTRS)
Hill, Gary C.
1992-01-01
Designer and design team productivity improves with skill, experience, and the tools available. The design process involves numerous trials and errors, analyses, refinements, and addition of details. Computerized tools have greatly speeded the analysis, and now new theories and methods, emerging under the label Artificial Intelligence (AI), are being used to automate skill and experience. These tools improve designer productivity by capturing experience, emulating recognized skillful designers, and making the essence of complex programs easier to grasp. This paper outlines the aircraft design process in today's technology and business climate, presenting some of the challenges ahead and some of the promising AI methods for meeting those challenges.
The purpose of this document is to introduce through a case study the use of the ground water geohydrology computer program WhAEM for Microsoft Windows (32-bit), or WhAEM2000. WhAEM2000 is a public domain, ground-water flow model designed to facilitate capture zone delineation an...
State and Regional Control of Geological Carbon Sequestration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reitze, Arnold; Durrant, Marie
2011-03-01
The United States has economically recoverable coal reserves of about 261 billion tons, which is in excess of a 250-year supply based on 2009 consumption rates. However, in the near future the use of coal may be legally restricted because of concerns over the effects of its combustion on atmospheric carbon dioxide concentrations. Carbon capture and geologic sequestration offer one method to reduce carbon emissions from coal and other hydrocarbon energy production. While the federal government is providing increased funding for carbon capture and sequestration, recent congressional legislative efforts to create a framework for regulating carbon emissions have failed. However, regional and state bodies have taken significant actions both to regulate carbon and to facilitate its capture and sequestration. This article explores how regional bodies and state governments are addressing the technical and legal problems that must be resolved in order to have a viable carbon sequestration program. Several regional bodies have formed regulations and model laws that affect carbon capture and storage, and three bodies comprising twenty-three states—the Regional Greenhouse Gas Initiative, the Midwest Regional Greenhouse Gas Reduction Accord, and the Western Climate Initiative—have cap-and-trade programs in various stages of development. State property, land use, and environmental laws affect the development and implementation of carbon capture and sequestration projects, and unless federal standards are imposed, state laws on torts and renewable portfolio requirements will directly affect the liability and viability of these projects. This paper examines current state laws and legislative efforts addressing carbon capture and sequestration.
Monitoring of adult Lost River and shortnose suckers in Clear Lake Reservoir, California, 2008–2010
Hewitt, David A.; Hayes, Brian S.
2013-01-01
Problems with inferring status and population dynamics from size composition data can be overcome by a robust capture-recapture program that follows the histories of PIT-tagged individuals. Inferences from such a program are currently hindered by poor detection rates during spawning seasons with low flows in Willow Creek, which indicate that a key assumption of capture-recapture models is violated. We suggest that the most straightforward solution to this issue would be to collect detection data during the spawning season using remote PIT tag antennas in the strait between the west and east lobes of the lake.
ERIC Educational Resources Information Center
Miller, Ann M.
A lexical representational analysis of Classical Arabic is proposed that captures a generalization that McCarthy's (1979, 1981) autosegmental analysis misses, namely that idiosyncratic characteristics of the derivational binyanim in Arabic are lexical, not morphological. This analysis captures that generalization by treating all the idiosyncrasies…
Liquid biopsy on chip: a paradigm shift towards the understanding of cancer metastasis.
Tadimety, Amogha; Syed, Abeer; Nie, Yuan; Long, Christina R; Kready, Kasia M; Zhang, John X J
2017-01-23
This comprehensive review serves as a guide for developing scalable and robust liquid biopsies on chip for capture, detection, and analysis of circulating tumor cells (CTCs). Liquid biopsy, the detection of biomarkers from body fluids, has proven challenging because of CTC rarity and the heterogeneity of CTCs shed from tumors. The review starts with the underlying biological mechanisms that make liquid biopsy a challenge before moving into an evaluation of current technological progress. Then, a framework for evaluation of the technologies is presented with special attention to throughput, capture rate, and cell viability for analysis. Technologies for CTC capture, detection, and analysis will be evaluated based on these criteria, with a focus on current approaches, limitations and future directions. The paper provides a critical review for microchip developers as well as clinical investigators to build upon the existing progress towards the goal of designing CTC capture, detection, and analysis platforms.
Performance analysis of Aloha networks with power capture and near/far effect
NASA Astrophysics Data System (ADS)
McCartin, Joseph T.
1989-06-01
An analysis is presented for the throughput characteristics for several classes of Aloha packet networks. Specifically, the throughput for variable packet length Aloha utilizing multiple power levels to induce receiver capture is derived. The results are extended to an analysis of a selective-repeat ARQ Aloha network. Analytical results are presented which indicate a significant increase in throughput for a variable packet network implementing a random two power level capture scheme. Further research into the area of the near/far effect on Aloha networks is included. Improvements in throughput for mobile radio Aloha networks which are subject to the near/far effect are presented. Tactical Command, Control and Communications (C3) systems of the future will rely on Aloha ground mobile data networks. The incorporation of power capture and the near/far effect into future tactical networks will result in improved system analysis, design, and performance.
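As background for the kind of analysis the abstract describes, the following sketch computes the throughput of a deliberately simplified variant: slotted Aloha with fixed-length packets, Poisson traffic, two transmit power levels, and a perfect capture rule. This is a textbook-style approximation under stated assumptions, not the paper's variable-packet-length model; the function name and the 50/50 power split are illustrative choices.

```python
import numpy as np

def slotted_aloha_capture_throughput(G, p_high):
    """Throughput of slotted Aloha with two power levels under an idealized
    capture rule: a slot succeeds if it contains exactly one high-power
    packet (which captures over any number of low-power packets), or no
    high-power packet and exactly one low-power packet.
    G: total offered load [packets/slot]; p_high: fraction sent at high power."""
    g_hi, g_lo = p_high * G, (1.0 - p_high) * G
    one_high = g_hi * np.exp(-g_hi)                    # exactly one high-power arrival
    lone_low = np.exp(-g_hi) * g_lo * np.exp(-g_lo)    # no high, exactly one low
    return one_high + lone_low

G = np.linspace(0.01, 4.0, 400)
plain = G * np.exp(-G)                                 # classic slotted Aloha, peak 1/e
two_level = slotted_aloha_capture_throughput(G, 0.5)
print(f"peak throughput: {plain.max():.3f} without capture, "
      f"{two_level.max():.3f} with a two-power-level capture scheme")
```

Even this crude rule pushes the peak throughput above the classic 1/e bound, which is the qualitative effect the paper quantifies for its richer model.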
NASA Astrophysics Data System (ADS)
Williamson, V. A.; Pyrtle, A. J.
2004-12-01
How did the 2003 Minorities Striving and Pursuing Higher Degrees of Success (MS PHD'S) in Ocean Sciences Program customize evaluative methodology and instruments to align with program goals and processes? How is data captured to document cognitive and affective impact? How are words and numbers utilized to accurately illustrate programmatic outcomes? How is compliance with implicit and explicit funding regulations demonstrated? The 2003 MS PHD'S in Ocean Sciences Program case study provides insightful responses to each of these questions. MS PHD'S was developed by and for underrepresented minorities to facilitate increased and sustained participation in Earth system science. Key components of this initiative include development of a community of scholars sustained by face-to-face and virtual mentoring partnerships; establishment of networking activities between and among undergraduate, graduate, postgraduate students, scientists, faculty, professional organization representatives, and federal program officers; and provision of forums to address real world issues as identified by each constituent group. The evaluative case study of the 2003 MS PHD'S in Ocean Sciences Program consists of an analysis of four data sets. Each data set was aligned to document progress in the achievement of the following program goals: Goal 1: The MS PHD'S Ocean Sciences Program will successfully market, recruit, select, and engage underrepresented student and non-student participants with interest/ involvement in Ocean Sciences; Goal 2: The MS PHD'S Ocean Sciences Program will provide meaningful engagement for participants as determined by quantitative analysis of user-feedback; Goal 3: The MS PHD'S Ocean Sciences Program will provide meaningful engagement for participants as determined by qualitative analysis of user-feedback, and; Goal 4: The MS PHD'S Ocean Sciences Program will develop a constituent base adequate to demonstrate evidence of interest, value, need and sustainability in its vision, mission, goals and activities. In addition to the documentation of evaluative process, the case study also provides insight on the establishment of mutually supportive principal investigator and evaluator partnerships as necessary foundations for building effective teams. The study addresses frequently asked questions (FAQ's) on the formation and sustenance of partnerships among visionaries and evaluators and the impact of this partnership on the achievement of program outcomes.
Richard, Lucie; Torres, Sara; Tremblay, Marie-Claude; Chiocchio, François; Litvak, Éric; Fortin-Pellerin, Laurence; Beaudet, Nicole
2015-06-14
Professional development is a key component of effective public health infrastructures. To be successful, professional development programs in public health and health promotion must adapt to practitioners' complex real-world practice settings while preserving the core components of those programs' models and theoretical bases. An appropriate balance must be struck between implementation fidelity, defined as respecting the core nature of the program that underlies its effects, and adaptability to context to maximize benefit in specific situations. This article presents a professional development pilot program, the Health Promotion Laboratory (HPL), and analyzes how it was adapted to three different settings while preserving its core components. An exploratory analysis was also conducted to identify team and contextual factors that might have been at play in the emergence of implementation profiles in each site. This paper describes the program, its core components and adaptive features, along with three implementation experiences in local public health teams in Quebec, Canada. For each setting, documentary sources were analyzed to trace the implementation of activities, including temporal patterns throughout the project for each program component. Information about teams and their contexts/settings was obtained through documentary analysis and semi-structured interviews with HPL participants, colleagues and managers from each organization. While each team developed a unique pattern of implementing the activities, all the program's core components were implemented. Differences of implementation were observed in terms of numbers and percentages of activities related to different components of the program as well as in the patterns of activities across time. It is plausible that organizational characteristics influencing, for example, work schedule flexibility or learning culture might have played a role in the HPL implementation process. This paper shows how a professional development program model can be adapted to different contexts while preserving its core components. Capturing the heterogeneity of the intervention's exposure, as was done here, will make possible in-depth impact analyses involving, for example, the testing of program-context interactions to identify program outcomes predictors. Such work is essential to advance knowledge on the action mechanisms of professional development programs.
Development of a SPARK Training Dataset
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sayre, Amanda M.; Olson, Jarrod R.
2015-03-01
In its first five years, the National Nuclear Security Administration's (NNSA) Next Generation Safeguards Initiative (NGSI) sponsored more than 400 undergraduate, graduate, and post-doctoral students in internships and research positions (Wyse 2012). In the past seven years, the NGSI program has produced, and continues to produce, a large body of scientific, technical, and policy work in targeted core safeguards capabilities and human capital development activities. Not only does the NGSI program carry out activities across multiple disciplines, but also across all U.S. Department of Energy (DOE)/NNSA locations in the United States. However, products are not readily shared among disciplines and across locations, nor are they archived in a comprehensive library. Rather, knowledge of NGSI-produced literature is localized to the researchers, clients, and internal laboratory/facility publication systems such as the Electronic Records and Information Capture Architecture (ERICA) at the Pacific Northwest National Laboratory (PNNL). There is also no incorporated way of analyzing existing NGSI literature to determine whether the larger NGSI program is achieving its core safeguards capabilities and activities. A complete library of NGSI literature could prove beneficial to a cohesive, sustainable, and more economical NGSI program. The Safeguards Platform for Automated Retrieval of Knowledge (SPARK) has been developed as a knowledge storage, retrieval, and analysis capability to capture safeguards knowledge so that it exists beyond the lifespan of NGSI. During the development process, it was necessary to build a SPARK training dataset (a corpus of documents) for initial entry into the system and for demonstration purposes. We manipulated these data to gain new information about the breadth of NGSI publications and evaluated the science-policy interface at PNNL as a practical demonstration of SPARK's intended analysis capability. The analysis demonstration sought to answer the question, "Who leads research and development at PNNL, scientists or policy researchers?" The analysis was inconclusive as to whether policy researchers or scientists are the primary drivers of research at PNNL. However, the dataset development and analysis activity did demonstrate the utility and usability of the SPARK dataset. After the initiation of the NGSI program there is a clear increase in the number of publications of safeguards products. Employing the natural language analysis tool IN-SPIRE™ showed the presence of vocation- and topic-specific vernacular within NGSI sub-topics. The methodology developed to define the scope of the dataset was useful in describing safeguards applications, and may be applicable to research on topics beyond safeguards. The analysis emphasized the need for an expanded dataset to fully understand the scope of safeguards publications and research both nationally and internationally. As the SPARK dataset grows to include publications outside PNNL, topics crosscutting disciplines and DOE/NNSA locations should become more apparent. NGSI was established in 2008 to cultivate the next generation of safeguards professionals and support the development of core safeguards capabilities (NNSA 2012). Now a robust system to preserve and share institutional memory such as SPARK is needed to inspire and equip the next generation of safeguards experts, technologies, and policies.
NASA Astrophysics Data System (ADS)
Bellerive, Nathalie
The research project hypothesis is that CO2 capture and sequestration (CSC) technologies lead to a significant decrease in global warming, but increase the impacts in all other categories of the study. This is because the processes used for CO2 capture and sequestration require additional quantities of raw materials and energy. Two other objectives are described in this project. The first is the modeling of an Integrated Gasification Combined Cycle power plant, for which there are no known generic data. The second is to select the right hypotheses regarding electrical production technologies, CO2 capture, compression and transportation by pipeline, and finally sequestration. Life Cycle Assessment (LCA) was chosen for this research project. LCA is an exhaustive quantitative method used to evaluate the potential environmental impacts associated with a product, a service, or an activity from resource extraction to waste elimination. This tool is governed by ISO 14040 through ISO 14049 and is sustained by the Society of Environmental Toxicology and Chemistry (SETAC) and the United Nations Environment Programme (UNEP). Two power plants were studied: the Integrated Gasification Combined Cycle (IGCC) power plant and the Natural Gas Combined Cycle (NGCC) power plant. In order to sequester CO2 in a geological formation, it is necessary to extract CO2 from the emission flows. For the IGCC power plant, CO2 was captured before combustion; for the NGCC power plant, capture was done post-combustion. Once the CO2 was isolated, it was compressed and directed through a transportation pipe 1,000 km in length, over land and under the sea. It is hypothesized that the power plant is 300 km from the shore and the sequestration platform 700 km from France's shore, in the North Sea. The IGCC power plant modeling and the data selection regarding CO2 capture and sequestration were done using primary data from industry and the Ecoinvent generic database (Version 1.2). This database was selected because of its European source. Finally, technical calculations and literature were used to complete the data inventory, which was validated by electrical experts in order to increase data and modeling precision. Results were similar for the IGCC and NGCC power plants using Impact 2002+, an impact assessment method. Global warming potential decreased by 67% with the implementation of CO2 capture and sequestration compared to systems without CSC. All other impact categories showed an increase of 16% to 116% in relative proportions compared to systems without CSC. The main contributor was the additional quantity of energy required to operate the CO2 capture and compression facilities. This additional energy reduces the power plant's overall efficiency because of the increase in the quantity of fossil fuel that must be extracted and consumed. The increase in other impacts was mainly due to additional electricity, fossil fuel (for extraction, treatment, and transportation), and additional emissions generated during power plant operations. A scenario analysis was done to study the sensitivity and variability of uncertain data in the software modeling of a power plant. Data on power plant efficiency are the most variable and sensitive in the modeling, followed by the length of the transportation pipe and the leakage rate during CO2 sequestration.
This scenario analysis is noteworthy because the maximum-efficiency scenario with capture (with a short CO2 transportation distance and a low leakage rate) obtained better results on all impact category indicators than the minimum-efficiency scenario without capture. In fact, positive results on all category indicators were possible in the comparison between the two systems (with and without capture). (Abstract shortened by UMI.)
Li, Peng; Gao, Yan; Pappas, Dimitri
2012-10-02
The ability to sort and capture more than one cell type from a complex sample will enable a wide variety of studies of cell proliferation and death and the analysis of disease states. In this work, we integrated a pneumatically actuated control layer with an affinity separation layer to create different antibody-coated regions in the same fluidic channel. The capture capabilities of different antibodies for the same cell line were compared by flowing Ramos cells through anti-CD19- and anti-CD71-coated regions in the same channel. The cell capture density on the anti-CD19 region was 2.44 ± 0.13 times higher than that on the anti-CD71-coated region. This approach can be used to test different affinity molecules for selectivity and capture efficiency using a single cell line in one separation. Selective capture of Ramos and HuT 78 cells from a mixture was also demonstrated using two antibody regions in the same channel. Greater than 90% purity was obtained on both capture areas in both continuous-flow and stop-flow separation modes. A four-region antibody-coated device was then fabricated to study the simultaneous, serial capture of three different cell lines. In this case the device showed effective capture of cells in a single separation channel, opening up the possibility of multiple cell sorting. Multiparameter sequential blood sample analysis was also demonstrated with high capture specificity (>97% for both CD19+ and CD4+ leukocytes). The chip can also be used to selectively treat cells after affinity separation.
Linking animal-borne video to accelerometers reveals prey capture variability
Watanabe, Yuuki Y.; Takahashi, Akinori
2013-01-01
Understanding foraging is important in ecology, as it determines the energy gains and, ultimately, the fitness of animals. However, monitoring prey captures by individual animals is difficult. Direct observations using animal-borne videos have short recording periods, and indirect signals (e.g., stomach temperature) are never validated in the field. We took an integrated approach to monitoring prey captures by a predator by deploying a video camera (lasting for 85 min) and two accelerometers (on the head and back, lasting for 50 h) on free-swimming Adélie penguins. The movies showed that penguins moved their heads rapidly to capture krill in midwater and fish (Pagothenia borchgrevinki) underneath the sea ice. Captures were remarkably fast (two krill per second in swarms) and efficient (244 krill or 33 P. borchgrevinki in 78–89 min). Prey captures were detected by the signal of head acceleration relative to body acceleration with high sensitivity and specificity (0.83–0.90), as shown by receiver-operating characteristic analysis. Extension of the signal analysis to the entire behavioral records showed that krill captures were spatially and temporally more variable than P. borchgrevinki captures. Notably, the frequency distribution of krill capture rate closely followed a power-law model, indicating that the foraging success of penguins depends on a small number of very successful dives. The three steps illustrated here (i.e., video observations, linking video to behavioral signals, and extension of signal analysis) are unique approaches to understanding the spatial and temporal variability of ecologically important events such as foraging. PMID:23341596
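The detection step described above, flagging moments when the head accelerates relative to the body and scoring the detector against video-labeled captures, can be sketched as follows. The threshold, refractory period, and window-level scoring are illustrative assumptions; the paper tuned its detector against the video ground truth rather than using fixed values like these.

```python
import numpy as np

def detect_capture_events(head_acc, body_acc, fs, thresh=1.0, refractory_s=0.5):
    """Flag candidate prey captures where head acceleration deviates from body
    acceleration (a rapid head strike moves the head but not the trunk).
    head_acc, body_acc: (n, 3) arrays; fs: sampling rate [Hz]."""
    rel = np.linalg.norm(head_acc - body_acc, axis=1)   # head motion relative to body
    events, last = [], -np.inf
    for i in np.flatnonzero(rel > thresh):
        if i - last >= refractory_s * fs:               # merge a burst into one event
            events.append(i)
        last = i
    return np.array(events)

def sensitivity_specificity(predicted, actual, n_windows):
    """Window-level confusion counts against video-labeled ground truth."""
    p, a = set(predicted), set(actual)
    tp, fp, fn = len(p & a), len(p - a), len(a - p)
    tn = n_windows - tp - fp - fn
    return tp / (tp + fn), tn / (tn + fp)
```

Sweeping the threshold and plotting sensitivity against (1 - specificity) yields the ROC curve the authors used to choose an operating point.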
ERIC Educational Resources Information Center
Camaioni, Nicole
2013-01-01
The overall purpose of this study was to capture the relationships made during the Campus Canines Program, an animal-assisted activity program, at the University of Pittsburgh. Meaningful social relationships create greater educational satisfaction. These social relationships are an important piece to creating and sustaining student involvement,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hawkes, Lynnette A.; Martinson, Rick D.; Absolon, Randall F.
1993-05-01
The seaward migration of salmonid smolts was monitored by the National Marine Fisheries Service (NMFS) at two sites on the Columbia River in 1992. The NMFS Smolt Monitoring Project is part of a larger Smolt Monitoring Program to index Columbia Basin juvenile salmonid stocks. It is coordinated by the Fish Passage Center (FPC) for the Columbia Basin Fish and Wildlife Agencies and Tribes. Its purpose is to facilitate fish passage through reservoirs and at dams by providing FPC with timely smolt migration data used for flow and spill management. Data are also used for travel time, migration timing, and relative run size magnitude analysis. This program is carried out under the auspices of the Northwest Power Planning Council Fish and Wildlife Program and is funded by the Bonneville Power Administration (BPA). Sampling sites were John Day and Bonneville Dams under the 1992 Smolt Monitoring Program. All pertinent fish capture, condition, brand recovery, and flow data were reported daily to FPC. These data were incorporated into the FPC's Fish Passage Data System (FPDS).
Ainalem, Ingrid; Berg, Agneta; Janlöv, Ann-Christin
2016-01-01
The aim of this study was to describe health care and social service professionals' experiences of a quality-improvement program implemented in the south of Sweden. The focus of the program was to develop inter-professional collaboration to improve care and service to people with psychiatric disabilities in ordinary housing. Focus group interviews and a thematic analysis were used. The results were captured as themes along the steps of the process: (I) Entering the quality-improvement program: lack of information about the program, the challenge of getting started, and approaching the resources reluctantly. (II) Doing the practice-based improvement work: facing unprepared workplaces, and doing twice the work. (III) Looking back, evaluation over 1 year: balancing theoretical knowledge with practical training, and considering profound knowledge as an integral part of work. The improvement process in clinical practice was found to be both time and energy consuming, yet worth the effort. The findings also indicate that collaboration across organizational boundaries was broadened, and the care and service delivery were improved. PMID:26783867
Checkmate: Capturing Gifted Students' Logical Thinking Using Chess.
ERIC Educational Resources Information Center
Rifner, Philip J.; Feldhusen, John F.
1997-01-01
Describes the use of chess instruction to develop abstract thinking skills and problem solving among gifted students. Offers suggestions for starting school chess programs, teaching and evaluating chess skills, and measuring the success of both student-players and the program in general. (PB)
The field analytical screening program (FASP) polychlorinated biphenyl (PCB) method uses a temperature-programmable gas chromatograph (GC) equipped with an electron capture detector (ECD) to identify and quantify PCBs. Gas chromatography is an EPA-approved method for determi...
FIELD ANALYTICAL SCREENING PROGRAM: PCP METHOD - INNOVATIVE TECHNOLOGY EVALUATION REPORT
The Field Analytical Screening Program (FASP) pentachlorophenol (PCP) method uses a gas chromatograph (GC) equipped with a megabore capillary column and flame ionization detector (FID) and electron capture detector (ECD) to identify and quantify PCP. The FASP PCP method is design...
Motion Analysis System for Instruction of Nihon Buyo using Motion Capture
NASA Astrophysics Data System (ADS)
Shinoda, Yukitaka; Murakami, Shingo; Watanabe, Yuta; Mito, Yuki; Watanuma, Reishi; Marumo, Mieko
The passing on and preserving of advanced technical skills has become an important issue in a variety of fields, and motion analysis using motion capture has recently become popular in the research of advanced physical skills. This research aims to construct a system having a high on-site instructional effect on dancers learning Nihon Buyo, a traditional dance in Japan, and to classify Nihon Buyo dancing according to style, school, and dancer's proficiency by motion analysis. We have been able to study motion analysis systems for teaching Nihon Buyo now that body-motion data can be digitized and stored by motion capture systems using high-performance computers. Thus, with the aim of developing a user-friendly instruction-support system, we have constructed a motion analysis system that displays a dancer's time series of body motions and center of gravity for instructional purposes. In this paper, we outline this instructional motion analysis system based on three-dimensional position data obtained by motion capture. We also describe motion analysis that we performed based on center-of-gravity data obtained by this system and motion analysis focusing on school and age group using this system.
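The center-of-gravity display mentioned above is commonly computed as a mass-weighted mean of segment centers of mass reconstructed from the motion-capture markers. The sketch below uses Dempster-style anthropometric mass fractions as a stand-in; the fractions, segment breakdown, and function name are assumptions for illustration and not necessarily what the authors used.

```python
import numpy as np

# Approximate per-segment body mass fractions (Dempster-style values; illustrative).
SEGMENT_MASS_FRACTION = {
    "trunk_head": 0.578, "upper_arm": 0.028, "forearm_hand": 0.022,
    "thigh": 0.100, "shank_foot": 0.061,
}

def body_center_of_gravity(segment_com):
    """Whole-body COG trajectory as the mass-weighted mean of segment centers
    of mass. segment_com maps segment name -> (n_frames, 3) array of positions
    reconstructed from the motion-capture markers."""
    total = sum(SEGMENT_MASS_FRACTION[name] for name in segment_com)
    cog = sum(SEGMENT_MASS_FRACTION[name] * np.asarray(pos)
              for name, pos in segment_com.items()) / total
    return cog    # (n_frames, 3) time series, suitable for the instructional display
```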
Computerized Spiral Analysis Using the iPad
Sisti, Jonathan A.; Christophe, Brandon; Seville, Audrey Rakovich; Garton, Andrew L.A.; Gupta, Vivek P.; Bandin, Alexander J.; Yu, Qiping; Pullman, Seth L.
2017-01-01
Background: Digital analysis of writing and drawing has become a valuable research and clinical tool for the study of upper limb motor dysfunction in patients with essential tremor, Parkinson's disease, dystonia, and related disorders. We developed a validated method of computerized spiral analysis of hand-drawn Archimedean spirals that provides insight into movement dynamics beyond subjective visual assessment using a Wacom graphics tablet. While the Wacom tablet method provides robust data, more widely available mobile technology platforms exist. New Method: We introduce a novel adaptation of the Wacom-based method for the collection of hand-drawn kinematic data using an Apple iPad. This iPad-based system is stand-alone, easy to use, can capture drawing data with either a finger or a capacitive stylus, is precise, and is potentially ubiquitous. Results: The iPad-based system acquires position and time data that are fully compatible with our original spiral analysis program. All of the important indices, including degree of severity, speed, presence of tremor, tremor amplitude, tremor frequency, variability of pressure, and tightness, are calculated from the digital spiral data, which the application is able to transmit. Comparison with Existing Method: While the iPad method is limited by current touch screen technology, it does collect data with acceptable congruence compared to the current Wacom-based method while providing the advantages of accessibility and ease of use. Conclusions: The iPad is capable of capturing precise digital spiral data for analysis of motor dysfunction while also providing a convenient, easy-to-use modality in clinics and potentially at home. PMID:27840146
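Two of the indices listed above, drawing speed and tremor frequency, can be roughed out from the captured (x, y, t) samples by unwrapping the spiral into radius versus angle and examining the residual about the ideal Archimedean trend. The sketch below is an illustrative reconstruction under those assumptions, not the validated clinical program, which computes many more indices and does so differently.

```python
import numpy as np

def spiral_indices(x, y, t):
    """Estimate mean drawing speed and a dominant tremor frequency from
    (x, y, t) samples of a drawn spiral (numpy arrays of equal length)."""
    dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
    speed = np.hypot(dx, dy) / dt                         # instantaneous speed
    xc, yc = x.mean(), y.mean()                           # crude spiral center
    r = np.hypot(x - xc, y - yc)
    theta = np.unwrap(np.arctan2(y - yc, x - xc))
    trend = np.polyval(np.polyfit(theta, r, 1), theta)    # ideal r = a + b*theta
    residual = r - trend                                  # tremor rides on the residual
    fs = 1.0 / np.median(dt)                              # effective sampling rate
    spec = np.abs(np.fft.rfft(residual - residual.mean()))
    freqs = np.fft.rfftfreq(residual.size, d=1.0 / fs)
    return speed.mean(), freqs[spec[1:].argmax() + 1]     # skip the DC bin
```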
NASA Technical Reports Server (NTRS)
Bechtel, R. D.; Mateos, M. A.; Lincoln, K. A.
1988-01-01
Briefly described are the essential features of a computer program designed to interface a personal computer with the fast, digital data acquisition system of a time-of-flight mass spectrometer. The instrumentation was developed to provide a time-resolved analysis of individual vapor pulses produced by the incidence of a pulsed laser beam on an ablative material. The high-repetition-rate spectrometer coupled to a fast transient recorder captures complete mass spectra every 20 to 35 microseconds, thereby providing the time resolution needed for the study of this sort of transient event. The program enables the computer to record the large amount of data generated by the system in short time intervals, and it provides the operator the immediate option of presenting the spectral data in several different formats. Furthermore, the system does this with a high degree of automation, including the tasks of mass-labeling the spectra and logging pertinent instrumental parameters.
U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Years 2012 and 2013
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2014-01-01
The NASA U.S. Spacesuit Knowledge Capture (KC) program has existed since the beginning of 2008. The program was designed to augment engineers and other technical team members with historical spacesuit information to add to their understanding of the spacesuit, its evolution, its limitations, and its capabilities. Over 40 seminars have captured spacesuit history and knowledge over the last six years of the program's existence. Subject matter experts have provided lectures, and some were interviewed to help bring the spacesuit to life so that lessons learned will never be lost. The program has also concentrated on reaching out to the public and industry by making the recorded events part of the public domain through the NASA technical library and YouTube media. The U.S. Spacesuit KC topics have included lessons learned from some of the most prominent spacesuit experts and spacesuit users, including current and former astronauts. The events have enriched the spacesuit legacy knowledge from the Gemini, Apollo, Skylab, Space Shuttle, and International Space Station Programs. In addition, expert engineers and scientists have shared their challenges and successes to be remembered. Judging by the thousands of people who have viewed the recordings online, the last few years have been some of the most successful of the KC program's life, with numerous digital recordings and public releases. This paper reviews the events accomplished and archived over Fiscal Years 2012 and 2013 and highlights a few of the most memorable ones. This paper also communicates ways to access the events, both those available internally on the NASA domain and those released on the public domain.
Bordeianou, Liliana; Cauley, Christy E; Antonelli, Donna; Bird, Sarah; Rattner, David; Hutter, Matthew; Mahmood, Sadiqa; Schnipper, Deborah; Rubin, Marc; Bleday, Ronald; Kenney, Pardon; Berger, David
2017-01-01
Two systems measure surgical site infection rates following colorectal surgeries: the American College of Surgeons National Surgical Quality Improvement Program and the Centers for Disease Control and Prevention National Healthcare Safety Network. The Centers for Medicare & Medicaid Services pay-for-performance initiatives use National Healthcare Safety Network data for hospital comparisons. This study aimed to compare database concordance. This is a multi-institution cohort study of a systemwide Colorectal Surgery Collaborative. The National Surgical Quality Improvement Program requires rigorous, standardized data capture techniques; the National Healthcare Safety Network allows 5 data capture techniques. Standardized surgical site infection rates were compared between databases, and the Cohen κ-coefficient was calculated. This study was conducted at Boston-area hospitals. National Healthcare Safety Network or National Surgical Quality Improvement Program patients undergoing colorectal surgery were included. Standardized surgical site infection rates were the primary outcomes of interest. Thirty-day surgical site infection rates were compared for 3547 (National Surgical Quality Improvement Program) vs 5179 (National Healthcare Safety Network) colorectal procedures (2012-2014). Discrepancies appeared: the National Surgical Quality Improvement Program database of hospital 1 (N = 1480 patients) routinely found surgical site infection rates of approximately 10%, with rates routinely deemed "exemplary" or "as expected" (100%). National Healthcare Safety Network data from the same hospital and time period (N = 1881) revealed a similar overall surgical site infection rate (10%), but standardized rates were deemed "worse than national average" 80% of the time. Overall, hospitals using less rigorous capture methods had improved surgical site infection rates in the National Healthcare Safety Network compared with standardized National Surgical Quality Improvement Program reports. The correlation coefficient between standardized infection rates was 0.03 (p = 0.88). During 25 site-time period observations, National Surgical Quality Improvement Program and National Healthcare Safety Network data matched for 52% of observations (13/25); κ = 0.10 (95% CI, -0.1366 to 0.3402; p = 0.403), indicating poor agreement. This study investigated hospitals located in the Northeastern United States only. Variation in Centers for Medicare & Medicaid Services-mandated National Healthcare Safety Network infection surveillance methodology leads to unreliable results, which is apparent when these results are compared with standardized data. High-quality data would improve care quality and allow outcomes to be compared among institutions.
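The κ = 0.10 agreement statistic reported above is the standard Cohen's kappa, observed agreement corrected for the agreement expected by chance from each rater's marginal frequencies. A minimal reconstruction follows; the 'ok'/'worse' verdict coding per site-time period is an assumed encoding for illustration, not the study's exact data structure.

```python
def cohens_kappa(pairs):
    """Cohen's kappa for two raters over paired labels, e.g.
    (NSQIP verdict, NHSN verdict) per site-time period."""
    n = len(pairs)
    observed = sum(a == b for a, b in pairs) / n          # raw agreement
    labels = {lab for pair in pairs for lab in pair}
    expected = sum((sum(a == lab for a, _ in pairs) / n) *
                   (sum(b == lab for _, b in pairs) / n) for lab in labels)
    return (observed - expected) / (1 - expected)

# Usage sketch: 25 site-time observations, 13 of which matched (52%).
# pairs = [("ok", "ok"), ("ok", "worse"), ...]  # hypothetical verdict pairs
# print(cohens_kappa(pairs))
```

The point the abstract makes is visible in the formula: 52% raw agreement can still yield a kappa near zero when the chance-expected agreement is nearly as high.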
McCarthy, Robert J; Levine, Stephen H; Reed, J Michael
2013-08-15
To predict the effectiveness of 3 interventional methods of population control for feral cat colonies, a population model was built on estimates of vital data for feral cats gathered from the literature regarding their demography and mating behavior. An individual-based stochastic simulation model was developed to evaluate the effectiveness of trap-neuter-release (TNR), lethal control, and trap-vasectomy-hysterectomy-release (TVHR) in decreasing the size of feral cat populations. TVHR outperformed both TNR and lethal control at all annual capture probabilities between 10% and 90%. Unless > 57% of cats were captured and neutered annually by TNR or removed by lethal control, there was minimal effect on population size. In contrast, with an annual capture rate of ≥ 35%, TVHR caused population size to decrease. An annual capture rate of 57% eliminated the modeled population in 4,000 days by use of TVHR, whereas > 82% was required for both TNR and lethal control. When the effect of the fraction of adult cats neutered on kitten and young juvenile survival rate was included in the analysis, TNR performed progressively worse and could be counterproductive, such that population size increased compared with no intervention at all. TVHR should be preferred over TNR for management of feral cats if a decrease in population size is the goal. This model allowed many factors related to the trapping program and the cats to be varied and should be useful for determining the financial and person-effort commitments required to have a desired effect on a given feral cat population.
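The authors' calibrated model is not given in the abstract, but the mechanism that favors TVHR (sterile males continuing to compete for matings) can be illustrated with a toy individual-based simulation. All vital rates and the mating rule below are illustrative stand-ins, not the paper's literature-derived estimates.

    import random

    def simulate(method, capture_p, years=30, n0=50, seed=1):
        """Toy individual-based colony projection; rates are illustrative."""
        rng = random.Random(seed)
        cats = [{"female": rng.random() < 0.5, "fertile": True} for _ in range(n0)]
        SURVIVAL, KITTENS = 0.7, 2  # annual adult survival; kittens reaching 1 yr
        for _ in range(years):
            survivors = []
            for cat in cats:
                if cat["fertile"] and rng.random() < capture_p:
                    if method == "lethal":
                        continue              # removed from the colony
                    cat["fertile"] = False    # TNR or TVHR: sterilized, released
                if rng.random() < SURVIVAL:
                    survivors.append(cat)
            cats = survivors
            males = [c for c in cats if not c["female"]]
            if males:
                # TVHR males keep mating behavior, so a female paired with a
                # sterile male loses that breeding opportunity; TNR males are
                # assumed not to block fertile matings.
                p_fertile_mate = (sum(m["fertile"] for m in males) / len(males)
                                  if method == "tvhr" else 1.0)
                for _ in [c for c in cats if c["female"] and c["fertile"]]:
                    if rng.random() < p_fertile_mate:
                        cats.extend({"female": rng.random() < 0.5, "fertile": True}
                                    for _ in range(KITTENS))
        return len(cats)

    for method in ("tnr", "lethal", "tvhr"):
        print(method, simulate(method, capture_p=0.35))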
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-04
... USFWS initiated a capture-based research program starting in 2008 on the sea ice off the Chukchi Sea coastline. Captures occur on the sea ice up to 100 mi (161 km) offshore of the Alaskan coastline between Shishmaref and Cape Lisburne (see Figure 1 in the USFWS' application). Take of ice seals may occur when the...
THE RADIATIVE NEUTRON CAPTURE ON 2H, 6Li, 7Li, 12C AND 13C AT ASTROPHYSICAL ENERGIES
NASA Astrophysics Data System (ADS)
Dubovichenko, Sergey; Dzhazairov-Kakhramanov, Albert; Burkova, Natalia
2013-05-01
The continued interest in the study of radiative neutron capture on atomic nuclei is due, on the one hand, to the important role played by this process in the analysis of many fundamental properties of nuclei and nuclear reactions and, on the other hand, to the wide use of capture cross-section data in various applications of nuclear physics and nuclear astrophysics, as well as to the importance of the analysis of primordial nucleosynthesis in the Universe. This paper is devoted to the description of results for the radiative neutron capture on certain light atomic nuclei at thermal and astrophysical energies. These processes are considered within the framework of the potential cluster model (PCM), a general description of which was given earlier. The use of intercluster potentials based on phase-shift analysis is demonstrated in calculations of the radiative capture characteristics. The considered capture reactions are not part of stellar thermonuclear cycles, but they are involved in the basic reaction chain of primordial nucleosynthesis in the course of the formation of the Universe.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miebach, Barbara; McDuffie, Dwayne; Spiry, Irina
The objective of this project is to design and build a bench-scale process for a novel phase-changing CO2 capture solvent. The project will establish scalability and technical and economic feasibility of using a phase-changing CO2 capture absorbent for post-combustion capture of CO2 from coal-fired power plants with 90% capture efficiency and 95% CO2 purity at a cost of $40/tonne of CO2 captured by 2025 and a cost of <$10/tonne of CO2 captured by 2035. This report presents system and economic analysis for a process that uses a phase-changing aminosilicone solvent to remove CO2 from pulverized coal (PC) power plant flue gas. The aminosilicone solvent is pure 1,3-bis(3-aminopropyl)-1,1,3,3-tetramethyldisiloxane (GAP-0). Performance of the phase-changing aminosilicone technology is compared to that of a conventional carbon capture system using aqueous monoethanolamine (MEA). This analysis demonstrates that the aminosilicone process has significant advantages relative to an MEA-based system. The first-year CO2 removal cost for the phase-changing CO2 capture process is $52.1/tonne, compared to $66.4/tonne for the aqueous amine process. The phase-changing CO2 capture process is less costly than MEA because of advantageous solvent properties that include higher working capacity, lower corrosivity, lower vapor pressure, and lower heat capacity. The phase-changing aminosilicone process has approximately 32% lower equipment capital cost compared to that of the aqueous amine process. However, this solvent is susceptible to thermal degradation at CSTR desorber operating temperatures, which could add as much as $88/tonne to the CO2 capture cost associated with solvent makeup. Future work is focused on mitigating this critical risk by developing an advanced low-temperature desorber that can deliver comparable desorption performance at a significantly reduced thermal degradation rate.
CO2 Capture by Cold Membrane Operation with Actual Power Plant Flue Gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chaubey, Trapti; Kulkarni, Sudhir; Hasse, David
The main objective of the project was to develop a post-combustion CO2 capture process based on hybrid cold-temperature membrane operation. The CO2 in the flue gas from a coal-fired power plant is pre-concentrated to >60% CO2 in a first-stage membrane operation, followed by liquefaction of the permeate stream to achieve >99% CO2 purity. The aim of the project was based on the DOE program goal of 90% CO2 capture with >95% CO2 purity from pulverized coal (PC) fired power plants at a carbon capture cost of $40/tonne by 2025. The project moves the technology from TRL 4 to TRL 5. The project involved optimization of the Air Liquide commercial 12" PI-1 bundle to improve bundle productivity by >30% compared to the previous baseline (DE-FE0004278), using computational fluid dynamics (CFD) modeling and bundle testing with synthetic flue gas at a 0.1 MWe bench-scale skid located at the Delaware Research and Technology Center (DRTC). In parallel, the next-generation polyimide-based novel PI-2 membrane was developed with 10 times the CO2 permeance of the commercial PI-1 membrane. The novel PI-2 membrane was scaled from mini-permeator to 1" permeator and 1" bundle for testing. Bundle development was conducted with a Development Spin Unit (DSU) installed at MEDAL. Air Liquide's cold membrane technology was demonstrated with real coal-fired flue gas at the National Carbon Capture Center (NCCC) with a 0.3 MWe field-test unit (FTU). The FTU was designed to incorporate testing of two PI-1 commercial membrane bundles (12" or 6" diameter) in parallel or series. A slip stream was sent to the next-generation PI-2 membrane for testing with real flue gas. The system exceeded performance targets with stable PI-1 membrane operation for over 500 hours of single-bundle, steady-state testing. The 12" PI-1 bundle exceeded the productivity target by achieving ~600 Nm3/hr against a target of ~455 Nm3/hr at a 90% capture rate. The cost of 90% CO2 capture from a 550 MWe net coal power plant was estimated at between $40 and $45/tonne. A 6" PI-1 bundle exhibited superior bundle performance compared to the 12" PI-1 bundle; however, the carbon capture cost was not lower with the 6" PI-1 bundle due to the higher bundle installed cost. A 1" PI-1 bundle was tested to compare bundles with different length/diameter ratios. This bundle exhibited the lowest performance due to its different fiber winding pattern and increased bundle non-ideality. Several long-term and parametric tests were conducted, with 3,200 hours of total run-time at NCCC. Finally, the new PI-2 membrane fiber was tested at small scale (1" modules) in real flue gas and exhibited up to 10 times the CO2 permeance and slightly lower CO2/N2 selectivity than the commercial PI-1 fiber. This corresponded to a projected 4-5 times increase in productivity per bundle and a potential cost reduction of $3/tonne for CO2 capture compared with PI-1. An analytical campaign was conducted to trace impurities such as NOx, mercury, arsenic, and selenium in gas and liquid samples through the carbon capture system. An Environmental, Health and Safety (EH&S) analysis was completed to estimate emissions from a 550 MWe net power plant with carbon capture using the cold membrane. A preliminary design and cost analysis was completed for a 550 tpd (~25 MWe) plant to assess the capital investment and carbon capture cost for PI-1 and PI-2 membrane solutions treating coal-fired flue gas. A comparison with an amine-based solution showed a significant cost advantage for the membrane at this scale. Additional preliminary design and cost analyses compared carbon capture from coal, natural gas, and SMR flue gas at the 550 tpd (~25 MWe) scale.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasenkamp, Daren; Sim, Alexander; Wehner, Michael
Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of cloud computing systems and has revealed a number of weaknesses in current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. The cloud-based approach is more flexible because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. The cloud-based approach is also more resilient to failure: as long as a single VM is running it can make progress, whereas as soon as one MPI node fails the whole analysis job fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.
Steinhaus, Daniel A; Waks, Jonathan W; Collins, Robert; Kleckner, Karen; Kramer, Daniel B; Zimetbaum, Peter J
2015-07-01
Device longevity in cardiac resynchronization therapy (CRT) is affected by the pacing capture threshold (PCT) and programmed pacing amplitude of the left ventricular (LV) pacing lead. The aims of this study were to evaluate the stability of LV pacing thresholds in a nationwide sample of CRT defibrillator recipients and to determine potential longevity improvements associated with a decrease in the LV safety margin while maintaining effective delivery of CRT. CRT defibrillator patients in the Medtronic CareLink database were eligible for inclusion. LV PCT stability was evaluated using ≥2 measurements over a 14-day period. Separately, a random sample of 7,250 patients with programmed right atrial and right ventricular amplitudes ≤2.5 V, LV thresholds ≤ 2.5 V, and LV pacing ≥90% were evaluated to estimate theoretical battery longevity improvement using LV safety margins of 0.5 and 1.5 V. Threshold stability analysis in 43,256 patients demonstrated LV PCT stability of <0.5 V in 77% of patients and <1 V in 95%. Device longevity analysis showed that the use of a 0.5-V safety margin increased average battery longevity by 0.62 years (95% confidence interval 0.61 to 0.63) compared with a safety margin of 1.5 V. Patients with LV PCTs >1 V had the greatest increases in battery life (mean increase 0.86 years, 95% confidence interval 0.85 to 0.87). In conclusion, nearly all CRT defibrillator patients had LV PCT stability <1.0 V. Decreasing the LV safety margin from 1.5 to 0.5 V provided consistent delivery of CRT for most patients and significantly improved battery longevity. Copyright © 2015 Elsevier Inc. All rights reserved.
Dente, Christopher J; Ashley, Dennis W; Dunne, James R; Henderson, Vernon; Ferdinand, Colville; Renz, Barry; Massoud, Romeo; Adamski, John; Hawke, Thomas; Gravlee, Mark; Cascone, John; Paynter, Steven; Medeiros, Regina; Atkins, Elizabeth; Nicholas, Jeffrey M
2016-03-01
Led by the American College of Surgeons Trauma Quality Improvement Program, performance improvement efforts have expanded to regional and national levels. The American College of Surgeons Trauma Quality Improvement Program recommends 5 audit filters to identify records with erroneous data, and the Georgia Committee on Trauma instituted standardized audit filter analysis in all Level I and II trauma centers in the state. Audit filter reports were performed from July 2013 to September 2014. Records were reviewed to determine whether there was erroneous data abstraction. Percent yield was defined as number of errors divided by number of charts captured. Twelve centers submitted complete datasets. During 15 months, 21,115 patient records were subjected to analysis. Audit filter captured 2,901 (14%) records and review yielded 549 (2.5%) records with erroneous data. Audit filter 1 had the highest number of records identified and audit filter 3 had the highest percent yield. Individual center error rates ranged from 0.4% to 5.2%. When comparing quarters 1 and 2 with quarters 4 and 5, there were 7 of 12 centers with substantial decreases in error rates. The most common missed complications were pneumonia, urinary tract infection, and acute renal failure. The most common missed comorbidities were hypertension, diabetes, and substance abuse. In Georgia, the prevalence of erroneous data in trauma registries varies among centers, leading to heterogeneity in data quality, and suggests that targeted educational opportunities exist at the institutional level. Standardized audit filter assessment improved data quality in the majority of participating centers. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Clark, Susan G.; Rutherford, Murray B.; Auer, Matthew R.; Cherney, David N.; Wallace, Richard L.; Mattson, David J.; Clark, Douglas A.; Foote, Lee; Krogman, Naomi; Wilshusen, Peter; Steelman, Toddi
2011-05-01
Environmental studies and environmental sciences programs in American and Canadian colleges and universities seek to ameliorate environmental problems through empirical enquiry and analytic judgment. In a companion article (Part 1) we describe the environmental program movement (EPM) and discuss factors that have hindered its performance. Here, we complete our analysis by proposing strategies for improvement. We recommend that environmental programs re-organize around three principles. First, adopt as an overriding goal the concept of human dignity—defined as freedom and social justice in healthy, sustainable environments. This clear higher-order goal captures the human and environmental aspirations of the EPM and would provide a more coherent direction for the efforts of diverse participants. Second, employ an explicit, genuinely interdisciplinary analytical framework that facilitates the use of multiple methods to investigate and address environmental and social problems in context. Third, develop educational programs and applied experiences that provide students with the technical knowledge, powers of observation, critical thinking skills and management acumen required for them to become effective professionals and leaders. Organizing around these three principles would build unity in the EPM while at the same time capitalizing on the strengths of the many disciplines and diverse local conditions involved.
Hartwig, Sophie A; Robinson, Lara R; Comeau, Dawn L; Claussen, Angelika H; Perou, Ruth
2017-07-01
This article presents the findings of a qualitative study of maternal perceptions of parenting following participation in Legacy for Children™ (Legacy), an evidence-based parenting program for low-income mothers of young children and infants. To further examine previous findings and better understand participant experiences, we analyzed semistructured focus-group discussions with predominantly Hispanic and Black, non-Hispanic Legacy mothers at two sites (n = 166) using thematic analysis and grounded theory techniques. The qualitative study presented here investigated how mothers view their parenting following participation in Legacy, allowing participants to describe their experience with the program in their own words, thus capturing an "insider" perspective. Mothers at both sites communicated knowledge and use of positive parenting practices targeted by the goals of Legacy; some site-specific differences emerged related to these parenting practices. These findings align with the interpretation of quantitative results from the randomized controlled trials and further demonstrate the significance of the Legacy program in promoting positive parenting for mothers living in poverty. This study emphasizes the importance of understanding real-world context regarding program efficacy and the benefit of using qualitative research to understand participant experiences. © 2017 Michigan Association for Infant Mental Health.
Wings in Orbit: Scientific and Engineering Legacies of the Space Shuttle, 1971-2010
NASA Technical Reports Server (NTRS)
Hale, Wayne (Editor); Lane, Helen (Editor); Chapline, Gail (Editor); Lulla, Kamlesh (Editor)
2011-01-01
The Space Shuttle is an engineering marvel perhaps only exceeded by the station itself. The shuttle was based on the technology of the 1960s and early 1970s and had to overcome significant challenges to make it reusable. Perhaps the greatest challenges were the main engines and the Thermal Protection System. The program has seen terrible tragedy in its 3 decades of operation, yet it has also seen marvelous success. One of the most notable successes is the Hubble Space Telescope, a program that would have been a failure without the shuttle's capability to rendezvous, capture, repair, and upgrade. Now Hubble is a shining example of success admired by people around the world. As the program comes to a close, it is important to capture the legacy of the shuttle for future generations. That is what "Wings In Orbit" does for space fans, students, engineers, and scientists. This book, written by the men and women who made the program possible, will serve as an excellent reference for building future space vehicles. We are proud to have played a small part in making it happen. Our journey to document the scientific and engineering accomplishments of this magnificent winged vehicle began with an audacious proposal: to capture the passion of those who devoted their energies to its success while answering the question "What are the most significant accomplishments?" of the longest-operating human spaceflight program in our nation's history. This is intended to be an honest, accurate, and easily understandable account of the research and innovation accomplished during the era.
Engineered Resilient Systems: Knowledge Capture and Transfer
2014-08-29
NASA Astrophysics Data System (ADS)
Roederer, Ian U.; Karakas, Amanda I.; Pignatari, Marco; Herwig, Falk
2016-04-01
We present a detailed analysis of the composition and nucleosynthetic origins of the heavy elements in the metal-poor ([Fe/H] = -1.62 ± 0.09) star HD 94028. Previous studies revealed that this star is mildly enhanced in elements produced by the slow neutron-capture process (s process; e.g., [Pb/Fe] = +0.79 ± 0.32) and rapid neutron-capture process (r process; e.g., [Eu/Fe] = +0.22 ± 0.12), including unusually large molybdenum ([Mo/Fe] = +0.97 ± 0.16) and ruthenium ([Ru/Fe] = +0.69 ± 0.17) enhancements. However, this star is not enhanced in carbon ([C/Fe] = -0.06 ± 0.19). We analyze an archival near-ultraviolet spectrum of HD 94028, collected using the Space Telescope Imaging Spectrograph on board the Hubble Space Telescope, and other archival optical spectra collected from ground-based telescopes. We report abundances or upper limits derived from 64 species of 56 elements. We compare these observations with s-process yields from low-metallicity AGB evolution and nucleosynthesis models. No combination of s- and r-process patterns can adequately reproduce the observed abundances, including the super-solar [As/Ge] ratio (+0.99 ± 0.23) and the enhanced [Mo/Fe] and [Ru/Fe] ratios. We can fit these features when including an additional contribution from the intermediate neutron-capture process (i process), which perhaps operated through the ingestion of H in He-burning convective regions in massive stars, super-AGB stars, or low-mass AGB stars. Currently, only the i process appears capable of consistently producing the super-solar [As/Ge] ratios and ratios among neighboring heavy elements found in HD 94028. Other metal-poor stars also show enhanced [As/Ge] ratios, hinting that operation of the i process may have been common in the early Galaxy. These data are associated with Program 072.B-0585(A), P.I. Silva. Some data presented in this paper were obtained from the Barbara A. Mikulski Archive for Space Telescopes (MAST). The Space Telescope Science Institute is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555. These data are associated with Programs GO-7402 and GO-8197. This work is based on data obtained from the European Southern Observatory (ESO) Science Archive Facility. These data are associated with Program 072.B-0585(A). This paper includes data taken at The McDonald Observatory of The University of Texas at Austin.
Samuel, Jonathan C; Sankhulani, Edward; Qureshi, Javeria S; Baloyi, Paul; Thupi, Charles; Lee, Clara N; Miller, William C; Cairns, Bruce A; Charles, Anthony G
2012-01-01
Road traffic injuries are a major cause of preventable death in sub-Saharan Africa. Accurate epidemiologic data are scarce and under-reporting from primary data sources is common. Our objectives were to estimate the incidence of road traffic deaths in Malawi using capture-recapture statistical analysis and to determine what future efforts will best improve upon this estimate. Our capture-recapture model combined primary data from both police and hospital-based registries over a one-year period (July 2008 to June 2009). The mortality incidences from the primary data sources were 0.075 and 0.051 deaths/1000 person-years, respectively. Using capture-recapture analysis, the combined incidence of road traffic deaths ranged from 0.192 to 0.209 deaths/1000 person-years. Additionally, police data were more likely to include victims who were male, drivers or pedestrians, and victims from incidents with more than one vehicle involved. We concluded that capture-recapture analysis is a good tool to estimate the incidence of road traffic deaths and that it overcomes the limitations of incomplete data sources. The World Health Organization estimated the incidence of road traffic deaths for Malawi utilizing a binomial regression model and survey data and found a similar estimate despite strikingly different methods, suggesting both approaches are valid. Further research should seek to improve capture-recapture data through the use of more than two data sources and by improving the accuracy of matches by minimizing missing data, applying geographic information systems, and using names and civil registration numbers if available.
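The two-source estimator behind such an analysis can be sketched in a few lines. A common choice is the Lincoln-Petersen estimator with the Chapman correction; the counts below are hypothetical, since the abstract reports rates rather than the raw police/hospital overlap, and the method assumes the two lists ascertain deaths independently (the caveat the authors' call for additional data sources addresses).

    # Two-source capture-recapture (Chapman-corrected Lincoln-Petersen).
    # Counts are hypothetical; the abstract gives rates, not raw overlap.
    def chapman(n_a, n_b, n_both):
        """Estimate total deaths from two overlapping incomplete lists."""
        n_hat = (n_a + 1) * (n_b + 1) / (n_both + 1) - 1
        return n_hat, n_a / n_hat, n_b / n_hat

    total, sens_police, sens_hospital = chapman(n_a=970, n_b=660, n_both=240)
    print(f"estimated total deaths: {total:.0f}")
    print(f"police list captures {sens_police:.0%}, hospital list {sens_hospital:.0%}")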
Squara, Fabien; Liuba, Ioan; Chik, William; Santangeli, Pasquale; Zado, Erica S; Callans, David J; Marchlinski, Francis E
2015-03-01
Capture of the myocardial sleeves of the pulmonary veins (PV) during PV pacing is mandatory for assessing exit block after PV isolation (PVI). However, previous studies reported that a significant proportion of PVs failed to demonstrate local capture after PVI. We designed this study to evaluate the prevalence and the clinical significance of loss of PV capture after PVI. Thirty patients (14 redo) undergoing antral PVI were included. Before and after PVI, local PV capture was assessed during circumferential pacing (10 mA/2 milliseconds) with a circular multipolar catheter (CMC), using EGM analysis from each dipole of the CMC and from the ablation catheter placed in the ipsilateral PV. Pacing output was varied to optimize identification of sleeve capture. All PVs demonstrated sleeve capture before PVI, but only 81% and 40% did so after first-time and redo PVI, respectively (P < 0.001 vs. before PVI). In multivariate analysis, absence of spontaneous PV depolarizations after PVI and previous PVI procedures were associated with less PV sleeve capture after PVI (40% sleeve capture, P < 0.001 for both). By design, loss of local PV capture was coincident with the development of PV entrance block and, importantly, predicted absence of acute reconnection during adenosine challenge with a 96% positive predictive value (23% negative predictive value). Loss of local PV capture is common after antral PVI resulting in entrance block and may be used as a specific alternate endpoint for PV electrical isolation. Additionally, loss of local PV capture may identify PVs at very low risk of acute reconnection during adenosine challenge. © 2014 Wiley Periodicals, Inc.
Density estimation using the trapping web design: A geometric analysis
Link, W.A.; Barker, R.J.
1994-01-01
Population densities for small mammal and arthropod populations can be estimated using capture frequencies for a web of traps. A conceptually simple geometric analysis that avoids the need to estimate a point on a density function is proposed. This analysis incorporates data from the outermost rings of traps, explaining the large capture frequencies in these rings rather than truncating them from the analysis.
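For orientation, the naive web estimator that such analyses refine divides captures in the innermost rings (where capture is assumed complete) by the area those rings enclose; the contribution of the geometric analysis is to model, rather than discard, the elevated outer-ring counts. A toy version of the naive estimator, with invented ring spacing and counts:

    import math

    # Naive trapping-web density estimate: captures in the inner rings
    # divided by the area those rings enclose. Radii and counts are
    # hypothetical; this is the textbook baseline, not the Link & Barker
    # geometric analysis itself.
    ring_radii = [5 * (i + 1) for i in range(8)]   # trap rings every 5 m
    captures = [12, 11, 9, 10, 8, 14, 19, 30]      # per-ring capture counts

    inner = 4                                       # rings assumed fully captured
    area = math.pi * ring_radii[inner - 1] ** 2     # area enclosed by ring 4, m^2
    density = sum(captures[:inner]) / area
    print(f"density ~ {density * 1e4:.0f} animals per hectare")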
Membrane Process to Capture CO2 from Coal-Fired Power Plant Flue Gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merkel, Tim; Wei, Xiaotong; Firat, Bilgen
2012-03-31
This final report describes work conducted for the U.S. Department of Energy National Energy Technology Laboratory (DOE NETL) on development of an efficient membrane process to capture carbon dioxide (CO2) from power plant flue gas (award number DE-NT0005312). The primary goal of this research program was to demonstrate, in a field test, the ability of a membrane process to capture up to 90% of the CO2 in coal-fired flue gas, and to evaluate the potential of a full-scale version of the process to perform this separation with less than a 35% increase in the levelized cost of electricity (LCOE). Membrane Technology and Research (MTR) conducted this project in collaboration with Arizona Public Services (APS), who hosted a membrane field test at their Cholla coal-fired power plant, and the Electric Power Research Institute (EPRI) and WorleyParsons (WP), who performed a comparative cost analysis of the proposed membrane CO2 capture process. The work conducted for this project included membrane and module development, slipstream testing of commercial-sized modules with natural gas and coal-fired flue gas, process design optimization, and a detailed systems and cost analysis of a membrane retrofit to a commercial power plant. The Polaris™ membrane developed over a number of years by MTR represents a step-change improvement in CO2 permeance compared to previous commercial CO2-selective membranes. During this project, membrane optimization work resulted in a further doubling of the CO2 permeance of the Polaris membrane while maintaining the CO2/N2 selectivity. This is an important accomplishment because increased CO2 permeance directly impacts the membrane skid cost and footprint: a doubling of CO2 permeance halves the skid cost and footprint. In addition to providing high CO2 permeance, flue gas CO2 capture membranes must be stable in the presence of contaminants including SO2. Laboratory tests showed no degradation in Polaris membrane performance during two months of continuous operation in a simulated flue gas environment containing up to 1,000 ppm SO2. A successful slipstream field test at the APS Cholla power plant was conducted with commercial-size Polaris modules during this project. This field test is the first demonstration of stable performance by commercial-sized membrane modules treating actual coal-fired power plant flue gas. Process design studies show that selective recycle of CO2 using a countercurrent membrane module with air as a sweep stream can double the concentration of CO2 in coal flue gas with little energy input. This pre-concentration of CO2 by the sweep membrane reduces the minimum energy of CO2 separation in the capture unit by up to 40% for coal flue gas. Variations of this design may be even more promising for CO2 capture from NGCC flue gas, in which the CO2 concentration can be increased from 4% to 20% by selective sweep recycle. EPRI and WP conducted a systems and cost analysis of a base case MTR membrane CO2 capture system retrofitted to the AEP Conesville Unit 5 boiler. Some of the key findings from this study and a sensitivity analysis performed by MTR include the following. The MTR membrane process can capture 90% of the CO2 in coal flue gas and produce high-purity CO2 (>99%) ready for sequestration. CO2 recycle to the boiler appears feasible with minimal impact on boiler performance; however, further study by a boiler OEM is recommended.
For a membrane process built today using a combination of slight feed compression, permeate vacuum, and current compression equipment costs, the membrane capture process can be competitive with the base case MEA process at 90% CO2 capture from a coal-fired power plant. The incremental LCOE for the base case membrane process is about equal to that of a base case MEA process, within the uncertainty in the analysis. With advanced membranes (5,000 gpu for CO2 and 50 for CO2/N2), operating with no feed compression and low-cost CO2 compression equipment, an incremental LCOE of $33/MWh at 90% capture can be achieved (40% lower than the advanced MEA case). Even with lower-cost compression, it appears unlikely that a membrane process using high feed compression (>5 bar) can be competitive with amine absorption, due to the capital cost and energy consumption of this equipment. Similarly, low vacuum pressure (<0.2 bar) cannot be used, due to the poor efficiency and high cost of this equipment. High membrane permeance is important to reduce the capital cost and footprint of the membrane unit; CO2/N2 selectivity is less important because it is too costly to generate a pressure ratio at which high selectivity would be useful. A potential cost "sweet spot" exists for membrane-based technology if 50-70% CO2 capture is acceptable: there is a minimum in the cost of CO2 avoided/ton that membranes can deliver at 60% CO2 capture, which is 20% lower than the cost at 90% capture, and membranes operating with no feed compression are best suited for these lower capture rates. Currently, it appears that the biggest hurdle to use of membranes for post-combustion CO2 capture is compression equipment cost. An alternative approach is to use sweep membranes in parallel with another CO2 capture technology that does not require feed compression or vacuum equipment. Hybrid designs that utilize sweep membranes for selective CO2 recycle show potential to significantly reduce the minimum energy of CO2 separation.
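The claim that doubling permeance halves skid size follows directly from the flux relation: required membrane area is the CO2 flow divided by permeance times the partial-pressure driving force. A rough Python illustration follows; the plant-scale flow and pressures are illustrative round numbers, not MTR's design values.

    # Membrane area ~ CO2 flow / (permeance * partial-pressure difference).
    # Flows and pressures are rough illustrative values, not design numbers.
    GPU = 3.35e-10  # 1 gpu in mol/(m^2 s Pa)

    def membrane_area(co2_mol_per_s, permeance_gpu, dp_co2_pa):
        flux = permeance_gpu * GPU * dp_co2_pa   # mol/(m^2 s)
        return co2_mol_per_s / flux

    co2_flow = 400e3 / 0.044 / 3600.0   # ~400 t/h captured, in mol/s (assumed)
    for gpu in (1000, 2000, 5000):
        area = membrane_area(co2_flow, gpu, dp_co2_pa=1.0e4)
        print(f"{gpu:>5} gpu -> ~{area / 1e3:.0f} thousand m^2")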
Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.
2008-01-01
This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
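The Bayesian step described above (beta priors on the structural parameters, updated with Gaussian likelihoods) can be sketched on a grid for a single parameter. Everything numeric below is synthetic; only the recipe follows the abstract.

    import numpy as np

    # Grid-based Bayesian update of one semivariogram structural parameter
    # (the range): Beta(3,3) prior on the scaled parameter, Gaussian
    # likelihood of an observed semivariogram. Values are illustrative,
    # not the Chicot aquifer data.
    lo, hi = 500.0, 5000.0                   # plausible range bounds (m), assumed
    a = np.linspace(lo, hi, 400)             # candidate range values
    u = (a - lo) / (hi - lo)
    prior = u**2 * (1 - u)**2                # Beta(3,3) shape on scaled parameter

    lags = np.array([250., 500., 1000., 2000.])
    gamma_obs = np.array([0.35, 0.60, 0.85, 0.98])   # synthetic semivariogram
    sigma = 0.05                                      # Gaussian noise level, assumed

    def gamma_model(rng):
        """Exponential semivariogram with unit sill and zero nugget."""
        return 1.0 - np.exp(-3.0 * lags / rng)

    loglik = np.array([-0.5 * np.sum((gamma_obs - gamma_model(r))**2) / sigma**2
                       for r in a])
    post = prior * np.exp(loglik - loglik.max())
    post /= post.sum()
    print(f"posterior mean range ~ {float((a * post).sum()):.0f} m")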
How much work-related injury and illness is missed by the current national surveillance system?
Rosenman, Kenneth D; Kalush, Alice; Reilly, Mary Jo; Gardiner, Joseph C; Reeves, Mathew; Luo, Zhewui
2006-04-01
We sought to estimate the undercount in the existing national surveillance system of occupational injuries and illnesses. Adhering to the strict confidentiality rules of the U.S. Bureau of Labor Statistics, we matched the companies and individuals who reported work-related injuries and illnesses to the Bureau in 1999, 2000, and 2001 in Michigan with companies and individuals reported in four other Michigan databases: workers' compensation, the OSHA Annual Survey, the OSHA Integrated Management Information System, and the Occupational Disease Report. We performed capture-recapture analysis to estimate the number of cases missed by the combined systems. We calculated that the current national surveillance system missed 61% of the work-related injuries and illnesses that occurred annually in Michigan, and up to 68% when estimated with capture-recapture analysis. This was true for injuries alone (60% and 67%) and illnesses alone (66% and 69%, respectively). The current national system for work-related injuries and illnesses markedly underestimates the magnitude of these conditions. A more comprehensive system, such as the one developed for traumatic workplace fatalities, that is not solely dependent on employer-based data sources is needed to better guide decision-making and the evaluation of public health programs to reduce work-related conditions.
Cannell, John; Jovic, Emelyn; Rathjen, Amy; Lane, Kylie; Tyson, Anna M; Callisaya, Michele L; Smith, Stuart T; Ahuja, Kiran Dk; Bird, Marie-Louise
2018-02-01
To compare the efficacy of novel interactive motion-capture rehabilitation software with usual-care stroke rehabilitation on physical function. Randomized controlled clinical trial. Two subacute hospital rehabilitation units in Australia. In all, 73 people less than six months after stroke with reduced mobility and clinician-determined capacity to improve. Both groups received functional retraining and individualized programs for up to an hour on weekdays, for 8-40 sessions (dose matched). For the intervention group, this individualized program used motivating virtual reality rehabilitation and novel gesture-controlled interactive motion-capture software. For usual care, the individualized program was delivered in a group class on one unit and by a rehabilitation assistant 1:1 on the other. The primary outcome was standing balance (functional reach). Secondary outcomes were lateral reach, step test, sitting balance, arm function, and walking. Participants (mean 22 days post-stroke) attended a mean of 14 sessions. Both groups improved (mean (95% confidence interval)) on the primary outcome of functional reach (usual care 3.3 (0.6 to 5.9) cm, intervention 4.1 (-3.0 to 5.0) cm), with no difference between groups (P = 0.69) on this or any secondary measure. No differences between the rehabilitation units were seen except in lateral reach (less affected side) (P = 0.04). No adverse events were recorded during therapy. Interactive motion-capture rehabilitation for inpatients post stroke produced functional improvements similar to those achieved by usual-care stroke rehabilitation, safely delivered by either a physical therapist or a rehabilitation assistant.
ERIC Educational Resources Information Center
Finucane, Mariel McKenzie; Martinez, Ignacio; Cody, Scott
2018-01-01
In the coming years, public programs will capture even more and richer data than they do now, including data from web-based tools used by participants in employment services, from tablet-based educational curricula, and from electronic health records for Medicaid beneficiaries. Program evaluators seeking to take full advantage of these data…
ERIC Educational Resources Information Center
Rowling, Louise; Jeffreys, Vicki
2006-01-01
Despite the intersectoral nature of health promotion practice many programs limit their evidence base to health sector research and do not draw on evidence from other sectors' research in program design. To help ensure programs are relevant and acceptable to intersectoral partners and intended outcomes are of value to all sectors involved,…
Quantification of epithelial cells in coculture with fibroblasts by fluorescence image analysis.
Krtolica, Ana; Ortiz de Solorzano, Carlos; Lockett, Stephen; Campisi, Judith
2002-10-01
To demonstrate that senescent fibroblasts stimulate the proliferation and neoplastic transformation of premalignant epithelial cells (Krtolica et al.: Proc Natl Acad Sci USA 98:12072-12077, 2001), we developed methods to quantify the proliferation of epithelial cells cocultured with fibroblasts. We stained epithelial-fibroblast cocultures with the fluorescent DNA-intercalating dye 4′,6-diamidino-2-phenylindole (DAPI), or expressed green fluorescent protein (GFP) in the epithelial cells and then cultured them with fibroblasts. The cocultures were photographed under an inverted microscope with appropriate filters, and the fluorescent images were captured with a digital camera. We modified an image analysis program to selectively recognize the smaller, more intensely fluorescent epithelial cell nuclei in DAPI-stained cultures and used the program to quantify areas with DAPI fluorescence generated by epithelial nuclei or GFP fluorescence generated by epithelial cells in each field. Analysis of the image areas with DAPI and GFP fluorescence produced nearly identical quantification of epithelial cells in coculture with fibroblasts. We confirmed these results by manual counting. In addition, GFP labeling permitted kinetic studies of the same coculture over multiple time points. The image analysis-based quantification method we describe here is an easy and reliable way to monitor cells in coculture and should be useful for a variety of cell biological studies. Copyright 2002 Wiley-Liss, Inc.
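A modern minimal analogue of that image-analysis step can be written with scikit-image rather than the authors' modified program: threshold the DAPI image, then keep only the smaller, brighter (epithelial-like) nuclei. The size and intensity cutoffs and the filename are placeholders, not the paper's calibrated criteria.

    # Sketch of DAPI-based quantification: threshold, label nuclei, and
    # report the area fraction of small, bright (epithelial-like) nuclei.
    # Cutoffs and filename are hypothetical.
    import numpy as np
    from skimage import io, filters, measure

    img = io.imread("field_dapi.tif").astype(float)    # hypothetical input image
    mask = img > filters.threshold_otsu(img)            # nuclei vs. background
    labels = measure.label(mask)

    epithelial_px = 0
    for region in measure.regionprops(labels, intensity_image=img):
        # Epithelial nuclei are smaller and more intensely fluorescent
        # than fibroblast nuclei.
        if region.area < 400 and region.mean_intensity > 1.5 * img.mean():
            epithelial_px += region.area

    print("epithelial area fraction:", epithelial_px / mask.size)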
Motivation for Evaluation: A roadmap for Improving Program Efficacy
NASA Astrophysics Data System (ADS)
Taber, J. J.; Bohon, W.; Bravo, T. K.; Dorr, P. M.; Hubenthal, M.; Johnson, J. A.; Sumy, D. F.; Welti, R.; Davis, H. B.
2016-12-01
Over the past year, the Incorporated Research Institutions for Seismology (IRIS) Education and Public Outreach (EPO) program has undertaken a new effort to increase the rigor with which it evaluates its programs and products. More specifically, we sought to make evaluation an integral part of our EPO staff's work, enable staff to demonstrate why we do the activities we do, enhance the impact of our products and programs, and empower staff to be able to make evidence-based claims. The challenges we faced included a modest budget, finding an applicable approach to both new and legacy programs ranging from formal and informal education to public outreach, and implementing the process without overwhelming staff. The Collaborative Impact Analysis Method (IAM; Davis and Scalice, 2015) was selected as it allowed us to combine the EPO staff's knowledge of programs, audiences and content with the expertise of an outside evaluation expert, through consultations and a qualitative rubric assessing the initial state of each product/program's evaluation. Staff then developed action plans to make incremental improvements to the evaluation of programs over time. We have found that this approach promotes the development of staff knowledge and skills regarding evaluation, provides a common language among staff, increases enthusiasm to collect and share data, encourages discussions of evaluative approaches when planning new activities, and improves each program's ability to capture the intended and unintended effects on the behaviors, attitudes, skills, interests, and/or knowledge of users/participants. We will share the initial IAM scores for products and programs in the EPO portfolio, along with examples of the action plans for several key products and programs, and the impact that implementing those action plans has had on our evaluations. Davis, H. & Scalice, D. (2015). Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method (Invited). Abstract ED53D-0871 presented at 2015 Fall Meeting, AGU, San Francisco, Calif., 14 - 18 Dec.
The Automated Instrumentation and Monitoring System (AIMS) reference manual
NASA Technical Reports Server (NTRS)
Yan, Jerry; Hontalas, Philip; Listgarten, Sherry
1993-01-01
Whether a researcher is designing the 'next parallel programming paradigm' or another 'scalable multiprocessor,' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of execution traces can help computer designers and software architects to uncover system behavior and to take advantage of specific application characteristics and hardware features. A software tool kit that facilitates performance evaluation of parallel applications on multiprocessors is described. The Automated Instrumentation and Monitoring System (AIMS) has four major software components: a source code instrumentor, which automatically inserts active event recorders into the program's source code before compilation; a run-time performance-monitoring library, which collects performance data; a trace file animation and analysis tool kit, which reconstructs program execution from the trace file; and a trace post-processor, which compensates for data collection overhead. Besides being used as a prototype for developing new techniques for instrumenting, monitoring, and visualizing parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware test beds to evaluate their impact on user productivity. Currently, AIMS instrumentors accept FORTRAN and C parallel programs written for Intel's NX operating system on the iPSC family of multicomputers. A run-time performance-monitoring library for the iPSC/860 is included in this release. We plan to release monitors for other platforms (such as PVM and TMC's CM-5) in the near future. Performance data collected can be graphically displayed on workstations (e.g., Sun SPARC and SGI) supporting X-Windows (in particular, X11R5 with Motif 1.1.3).
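AIMS instruments Fortran/C source before compilation, but the event-recorder idea is easy to mimic in miniature. The Python decorator below is only an analogy to the inserted recorders and the resulting trace, not AIMS code.

    # Toy illustration of source-level instrumentation: a decorator stands
    # in for the inserted "event recorders," appending timestamped
    # entry/exit events to a trace that can be replayed or analyzed later.
    import functools, time

    TRACE = []

    def instrument(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            TRACE.append(("enter", fn.__name__, time.perf_counter()))
            try:
                return fn(*args, **kwargs)
            finally:
                TRACE.append(("exit", fn.__name__, time.perf_counter()))
        return wrapper

    @instrument
    def compute(n):
        return sum(i * i for i in range(n))

    compute(100000)
    for kind, name, t in TRACE:
        print(f"{t:.6f} {kind:5s} {name}")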
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mulder, John C.; Schwartz, Moses Daniel; Berg, Michael J.
2013-10-01
Critical infrastructures, such as electrical power plants and oil refineries, rely on programmable logic controllers (PLCs) to control essential processes. State of the art security cannot detect attacks on PLCs at the hardware or firmware level. This renders critical infrastructure control systems vulnerable to costly and dangerous attacks. WeaselBoard is a PLC backplane analysis system that connects directly to the PLC backplane to capture backplane communications between modules. WeaselBoard forwards inter-module traffic to an external analysis system that detects changes to process control settings, sensor values, module configuration information, firmware updates, and process control program (logic) updates. WeaselBoard provides zero-day exploit detection for PLCs by detecting changes in the PLC and the process. This approach to PLC monitoring is protected under U.S. Patent Application 13/947,887.
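A heavily simplified sketch of the detection idea: baseline a digest of each module's logic and configuration blocks as they appear in captured backplane traffic, then alert on any change. The message fields below are invented for illustration; real PLC backplane protocols are vendor-specific.

    # Change detection over captured backplane traffic: hash each module's
    # observed logic/configuration payload and compare against a baseline.
    import hashlib

    baseline = {}  # (module_id, block_kind) -> digest of known-good contents

    def observe(module_id, block_kind, payload: bytes):
        digest = hashlib.sha256(payload).hexdigest()
        key = (module_id, block_kind)
        if key not in baseline:
            baseline[key] = digest            # first sighting becomes baseline
        elif baseline[key] != digest:
            print(f"ALERT: {block_kind} changed on module {module_id}")
            baseline[key] = digest

    observe(3, "logic", b"LD X1 / AND X2 / OUT Y1")
    observe(3, "logic", b"LD X1 / AND X2 / OUT Y1")   # no change
    observe(3, "logic", b"LD X1 / OUT Y1")            # triggers an alert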
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jahnke, Fred C.
FuelCell Energy, with support from the Department of Energy's Office of Energy Efficiency and Renewable Energy (EERE), has investigated the production of low-cost, low-CO2 hydrogen using a molten carbonate fuel cell operating as an electrolyzer. We confirmed the feasibility of the technology by testing a large-scale short stack. Economic analysis was done with the assistance of the National Fuel Cell Center at the University of California, Irvine, and we found the technology to be attractive, especially for distributed hydrogen. We explored performance under various operating parameters and developed an accurate model for further analysis and development calculations. We achieved the expected results, meeting all program goals. We identified additional uses of the technology, such as CO2 capture, power storage, and power load leveling.
Game Development as a Pathway to Information Technology Literacy
ERIC Educational Resources Information Center
Frydenberg, Mark
2016-01-01
Teaching game development has become an accepted methodology for introducing programming concepts and capturing the interest of beginning computer science and information technology (IT) students. This study, conducted over three consecutive semesters, explores game development using a gaming engine, rather than a traditional programming language,…
PRACTICAL APPLICATIONS FROM OBSERVATIONS OF MERCURY OXIDATION AND BINDING MECHANISMS
This paper describes a bench-scale program at the U.S. EPA. The goals of this program are to (a) isolate individual mechanisms of elemental mercury oxidation and oxidized mercury capture, (b) compete these mechanisms over a broad temperature range to determine which are dominant...
Link, William A; Barker, Richard J
2005-03-01
We present a hierarchical extension of the Cormack-Jolly-Seber (CJS) model for open population capture-recapture data. In addition to recaptures of marked animals, we model first captures of animals and losses on capture. The parameter set includes capture probabilities, survival rates, and birth rates. The survival rates and birth rates are treated as a random sample from a bivariate distribution, thus the model explicitly incorporates correlation in these demographic rates. A key feature of the model is that the likelihood function, which includes a CJS model factor, is expressed entirely in terms of identifiable parameters; losses on capture can be factored out of the model. Since the computational complexity of classical likelihood methods is prohibitive, we use Markov chain Monte Carlo in a Bayesian analysis. We describe an efficient candidate-generation scheme for Metropolis-Hastings sampling of CJS models and extensions. The procedure is illustrated using mark-recapture data for the moth Gonodontis bidentata.
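A minimal, self-contained illustration of Metropolis-Hastings sampling for a CJS-type likelihood follows (constant survival phi and capture probability p, a single release cohort, simulated data). This is a plain random-walk sampler with flat priors on (0, 1), not the paper's hierarchical model or its efficient candidate-generation scheme.

    import math, random

    T = 6  # recapture occasions after a single release

    def simulate(n, phi, p, seed=0):
        # Simulate capture histories: survive each interval w.p. phi,
        # then be recaptured w.p. p if alive.
        rng = random.Random(seed)
        hists = []
        for _ in range(n):
            h, alive = [], True
            for _ in range(T):
                alive = alive and rng.random() < phi
                h.append(1 if alive and rng.random() < p else 0)
            hists.append(h)
        return hists

    def loglik(hists, phi, p):
        # chi[t] = P(never seen after occasion t | alive at t)
        chi = [0.0] * (T + 1)
        chi[T] = 1.0
        for t in range(T - 1, -1, -1):
            chi[t] = (1 - phi) + phi * (1 - p) * chi[t + 1]
        ll = 0.0
        for h in hists:
            last = max((i + 1 for i, x in enumerate(h) if x), default=0)
            for t in range(last):
                ll += math.log(phi) + math.log(p if h[t] else 1 - p)
            ll += math.log(chi[last])
        return ll

    hists = simulate(300, phi=0.8, p=0.4)
    rng = random.Random(1)
    phi, p = 0.5, 0.5
    cur = loglik(hists, phi, p)
    draws = []
    for it in range(6000):
        phi2, p2 = phi + rng.gauss(0, 0.05), p + rng.gauss(0, 0.05)
        if 0 < phi2 < 1 and 0 < p2 < 1:           # flat priors on (0, 1)
            cand = loglik(hists, phi2, p2)
            if math.log(rng.random()) < cand - cur:
                phi, p, cur = phi2, p2, cand
        if it >= 1000:
            draws.append((phi, p))
    n = len(draws)
    print("posterior mean phi = %.2f, p = %.2f"
          % (sum(d[0] for d in draws) / n, sum(d[1] for d in draws) / n))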
Adlhoch, Cornelia; Kaiser, Marco; Hoehne, Marina; Mas Marques, Andreas; Stefas, Ilias; Veas, Francisco; Ellerbrok, Heinz
2011-02-10
The principle of a capture ELISA is binding of specific capture antibodies (polyclonal or monoclonal) to the surface of a suitable 96-well plate. These immobilized antibodies are capable of specifically binding a virus present in a clinical sample. Subsequently, the captured virus is detected using a specific detection antibody. The drawback of this method is that a capture ELISA can only function for the single virus captured by the primary antibody. Human apolipoprotein H (ApoH), or β2-glycoprotein 1, is able to poly-specifically bind viral pathogens. Replacing specific capture antibodies with ApoH should allow poly-specific capture of different viruses that could subsequently be revealed using specific detection antibodies. Thus, using a single capture ELISA format, different viruses could be analysed depending on the detection antibody applied. To demonstrate that this is a valid approach, we show detection of group A rotaviruses from stool samples as a proof of principle for a new method of capture ELISA that should also be applicable to other viruses. Stool samples of different circulating common human and potentially zoonotic group A rotavirus strains, pretested in commercial EIAs and genotyped by PCR, were tested in parallel in an ApoH-ELISA set-up and by quantitative real-time PCR (qPCR). Several control samples were included in the analysis. The ApoH-ELISA was suitable for the capture of rotavirus particles and detection down to 1,000 infectious units (TCID50/ml). Subsets of diagnostic samples of different G- and P-types tested positive in the ApoH-ELISA at different dilutions. Compared to the qPCR results, the analysis showed high sensitivity, high specificity, and low cross-reactivity for the ApoH-ELISA, which was confirmed in receiver operating characteristic (ROC) analysis. In this study, the development of a highly sensitive and specific capture ELISA was demonstrated by combining a poly-specific ApoH capture step with specific detection antibodies, using group A rotaviruses as an example.
NASA Astrophysics Data System (ADS)
Modine, Normand; Wright, Alan; Lee, Stephen
2015-03-01
Carrier recombination due to defects can have a major impact on device performance. The rate of defect-induced recombination is determined by both defect levels and carrier capture cross-sections. Density functional theory (DFT) has been widely and successfully used to predict defect levels, but only recently has work begun to focus on using DFT to determine carrier capture cross-sections. Lang and Henry worked out the fundamental theory of carrier-capture by multiphonon emission in the 1970s and showed that, above the Debye temperature, carrier-capture cross-sections differ between defects primarily due to differences in their carrier capture activation energies. We present an approach to using DFT to calculate carrier capture activation energies that does not depend on an assumed configuration coordinate and that fully accounts for anharmonic effects, which can substantially modify carrier activation energies. We demonstrate our approach for the -3/-2 level of the Ga vacancy in wurtzite GaN. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
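The Lang-Henry result referenced above gives, above the Debye temperature, a thermally activated cross-section σ(T) = σ∞ exp(−Ea/kBT); multiplying by the carrier thermal velocity gives the capture coefficient. The prefactor, activation energy, and effective mass in the sketch below are generic illustrative values, not the GaN vacancy results.

    import math

    # Thermally activated capture (Lang-Henry form): sigma(T) = sigma_inf *
    # exp(-E_a / kT); capture coefficient c = sigma(T) * v_th.
    K_B = 8.617e-5           # Boltzmann constant, eV/K
    SIGMA_INF = 1e-15        # cm^2, high-T cross-section prefactor (assumed)
    E_A = 0.30               # eV, capture activation energy (assumed)

    def capture_coefficient(T, m_eff_ratio=0.2):
        """Capture coefficient c = sigma(T) * v_th, in cm^3/s."""
        sigma = SIGMA_INF * math.exp(-E_A / (K_B * T))                        # cm^2
        v_th = math.sqrt(3 * 1.380649e-23 * T / (m_eff_ratio * 9.109e-31))   # m/s
        return sigma * v_th * 100.0                                           # cm^3/s

    for T in (200, 300, 400, 600):
        print(f"T = {T:3d} K: c = {capture_coefficient(T):.2e} cm^3/s")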
NASA Astrophysics Data System (ADS)
Modine, N. A.; Wright, A. F.; Lee, S. R.
The rate of defect-induced carrier recombination is determined by both defect levels and carrier capture cross-sections. Density functional theory (DFT) has been widely and successfully used to predict defect levels, but only recently has work begun to focus on using DFT to determine carrier capture cross-sections. Lang and Henry developed the theory of carrier-capture by multiphonon emission in the 1970s and showed that carrier-capture cross-sections differ between defects primarily due to differences in their carrier capture activation energies. We present an approach to using DFT to calculate carrier capture activation energies that does not depend on an assumed configuration coordinate and that fully accounts for anharmonic effects, which can substantially modify carrier activation energies. We demonstrate our approach for intrinsic defects in GaAs and GaN and discuss how our results depend on the choice of exchange-correlation functional and the treatment of spin polarization. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
Progress Report Abstracts. Oceanic Biology Program.
1982-12-01
predominantly of myctophids and larval fishes of the genus Sebastolobus (having swimbladders). Our sampling efforts were concentrated on the 400m DSL...in collection of 7 individual animals of the genus Nanomi sp. These animals were gently captured with the SGR and their oxygen consumption and...fishes of the genus Sebastolobus altivelis, demersal fish as adults (Moser, 1974), were captured similarly at 600m with the SGR and their in situ
National Reconnaissance Almanac
2011-01-01
Germany. 1945 Mar. 19: The German V-2 program was abandoned, leaving rocket technology for capture by Allied forces. 1946 Apr. 16: The U.S. Army...first launched captured German V-2 rocket at White Sands, New Mexico during missile testing. May 2: RAND report, “Preliminary Design of Experimental...first human in space. May 5: LCDR Alan Shepard became first American in space during a brief sub-orbital flight. Aug. 30: USAF launched Corona
Trade Space Specification Tool (TSST) for Rapid Mission Architecture (Version 1.2)
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Schrock, Mitchell; Borden, Chester S.; Moeller, Robert C.
2013-01-01
Trade Space Specification Tool (TSST) is designed to capture quickly ideas in the early spacecraft and mission architecture design and categorize them into trade space dimensions and options for later analysis. It is implemented as an Eclipse RCP Application, which can be run as a standalone program. Users rapidly create concept items with single clicks on a graphical canvas, and can organize and create linkages between the ideas using drag-and-drop actions within the same graphical view. Various views such as a trade view, rules view, and architecture view are provided to help users to visualize the trade space. This software can identify, explore, and assess aspects of the mission trade space, as well as capture and organize linkages/dependencies between trade space components. The tool supports a user-in-the-loop preliminary logical examination and filtering of trade space options to help identify which paths in the trade space are feasible (and preferred) and what analyses need to be done later with executable models. This tool provides multiple user views of the trade space to guide the analyst/team to facilitate interpretation and communication of the trade space components and linkages, identify gaps in combining and selecting trade space options, and guide user decision-making for which combinations of architectural options should be pursued for further evaluation. This software provides an environment to capture mission trade space elements rapidly and assist users for their architecture analysis. This is primarily focused on mission and spacecraft architecture design, rather than general-purpose design application. In addition, it provides more flexibility to create concepts and organize the ideas. The software is developed as an Eclipse plug-in and potentially can be integrated with other Eclipse-based tools.
NASA Astrophysics Data System (ADS)
Antonopoulos, Chrissi Argyro
This study presents findings from survey and interview data investigating replication of green building measures by Commercial Building Partnership (CBP) partners that worked directly with the Pacific Northwest National Laboratory (PNNL). PNNL partnered directly with 12 organizations on new and retrofit construction projects, which represented approximately 28 percent of the entire U.S. Department of Energy (DOE) CBP program. Through a feedback survey mechanism, along with personal interviews, quantitative and qualitative data were gathered relating to replication efforts by each organization. These data were analyzed to provide insight into two primary research areas: 1) CBP partners' replication efforts of green building approaches used in the CBP project to the rest of the organization's building portfolio, and 2) the market potential for technology diffusion into the total U.S. commercial building stock, as a direct result of the CBP program. The first area of this research focused specifically on replication efforts underway or planned by each CBP program participant. The second area of this research develops a diffusion of innovations model to analyze potential broad market impacts of the CBP program on the commercial building industry in the United States. Findings from this study provided insight into the motivations and objectives CBP partners had for program participation. Factors that impact replication include motivation, organizational structure, and the objectives firms have for implementing energy-efficient technologies. Comparing these factors between different CBP partners revealed patterns in motivation for constructing energy efficient buildings, along with better insight into market trends for green building practices. The optimized approach to the CBP program allows partners to develop green building parameters that fit the specific uses of their building, resulting in greater motivation for replication. In addition, the diffusion model developed for this analysis indicates that this method of market prediction may be used to adequately capture cumulative construction metrics for a whole-building analysis, as opposed to the individual energy efficiency measures used in green building.
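The abstract does not give the functional form of its diffusion model; a standard choice in diffusion-of-innovations work is the Bass model, in which new adopters arrive at rate (p + q*N/M)(M - N). A sketch with purely illustrative parameters (p, q, and the market size M are assumptions, not the study's fitted values):

    def bass_adoption(p, q, market_size, years):
        """Cumulative adopters per year under the Bass diffusion model."""
        cumulative, path = 0.0, []
        for _ in range(years):
            new = (p + q * cumulative / market_size) * (market_size - cumulative)
            cumulative += new
            path.append(cumulative)
        return path

    # Illustrative only: p = innovation, q = imitation coefficients
    for year, n in enumerate(bass_adoption(0.03, 0.38, 5.0e6, 10), start=1):
        print(f"year {year}: {n:,.0f} adopting buildings")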
Dual Fan Separator within the Universal Waste Management System
NASA Technical Reports Server (NTRS)
Stapleton, Tom; Converse, Dave; Broyan, James Lee, Jr.
2014-01-01
Since NASA's new spacecraft in development for both LEO and deep space capability have considerably less crew volume than the Space Shuttle, the need became apparent for a smaller commode. In response, the Universal Waste Management System (UWMS) was designed, resulting in an 80% volume reduction from the last US commode while enhancing performance. The ISS WMS and previous shuttle commodes have a fan supplying air flow to capture feces and a separator to capture urine and separate air from the captured air/urine mixture. The UWMS combined both rotating components into a single unit, referred to as the Dual Fan Separator (DFS). The combination of these components resulted in considerable packaging efficiency and weight reduction, removing inter-component plumbing and individual mounting configurations and requiring only a single motor and motor controller. In some of the intended UWMS platform applications, the urine is pumped to the ISS Urine Processor Assembly (UPA) system, which requires the DFS to deliver urine with less than 2.00% air inclusion by volume. The rotational speed of a centrifugal urine separator needs to be kept as low as possible to reduce air inclusion in the pumped fluid, while fans depend on rotational speed to develop delivered head. To satisfy these conflicting requirements, a gear reducer was included, allowing the fans to rotate at a much higher speed than the separator. This paper outlines the studies and analysis performed to develop the DFS configuration, including a configuration trade study, a dynamic stability analysis of the rotating bodies, and a performance analysis of the included labyrinth seals. NASA is considering a program to fly the UWMS aboard the ISS as a flight experiment. The goal of this activity is to advance the Technology Readiness Level (TRL) of the DFS and determine if the concept is ready to be included as part of the flight experiment deliverable.
Productivity Research and Development Planning Workshop.
1986-03-01
mission. These ideas must come from everyone in all echelons of the Command. The MAC quality of worklife efforts consist of quality circles, labor-manage...that would not have been captured by the measure as initially formulated by the PIG. The balance of the plan and recommendations made by the PMWG...people are doing things. We have the suggestion program, the quality of worklife program, and the Tech Mod program. All of these programs are productivity
Supercritical wing sections 2, volume 108
NASA Technical Reports Server (NTRS)
Bauer, F.; Garabedian, P.; Korn, D.; Jameson, A.; Beckmann, M. (Editor); Kuenzi, H. P. (Editor)
1975-01-01
A mathematical theory for the design and analysis of supercritical wing sections was previously presented, with examples and computer programs showing how the method works. The work on transonics is presented here in a more definitive form. For design, a better model of the trailing edge is introduced that should eliminate the fifteen or twenty percent loss in lift experienced with previous heavily aft-loaded models, which is attributed to boundary layer separation. How drag creep can be reduced at off-design conditions is indicated. A rotated finite difference scheme is presented that enables the application of Murman's method of analysis in more or less arbitrary curvilinear coordinate systems. This allows the use of supersonic as well as subsonic free stream Mach numbers and the capture of shock waves as far back on an airfoil as desired. Moreover, it leads to an effective three-dimensional program for the computation of transonic flow past an oblique wing. In the case of two-dimensional flow, the method is extended to take into account the displacement thickness computed by a semi-empirical turbulent boundary layer correction.
Lim, Sungwoo; Singh, Tejinder P; Hall, Gerod; Walters, Sarah; Gould, L Hannah
2018-03-12
To assess the impact of a New York City supportive housing program on housing stability and preventable emergency department (ED) visits/hospitalizations among heads of homeless families with mental and physical health conditions or substance use disorders. Multiple administrative data from New York City and New York State for 966 heads of families eligible for the program during 2007-12. We captured housing events and health care service utilization during 2 years prior to the first program eligibility date (baseline) and 2 years postbaseline. We performed sequence analysis to measure housing stability and compared housing stability and preventable ED visits and hospitalizations between program participants (treatment group) and eligible applicants not placed in the program (comparison group) via marginal structural modeling. We matched electronically collected data. Eighty-seven percent of supportive housing tenants experienced housing stability in 2 years postbaseline. Compared with unstably housed heads of families in the comparison group, those in the treatment group were 0.60 times as likely to make preventable ED visits postbaseline (95% CI = 0.38, 0.96). Supportive housing placement was associated with improved housing stability and reduced preventable health care visits among homeless families. © Health Research and Educational Trust.
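Marginal structural models are usually fit with inverse-probability-of-treatment weights. The toy sketch below shows that weighting step on simulated data; the propensity model, outcome rates, and effect size are invented for illustration and are not the study's estimates:

    import numpy as np

    rng = np.random.default_rng(42)
    n = 966
    confounder = rng.normal(size=n)                  # e.g., baseline ED use
    p_treat = 1 / (1 + np.exp(-0.5 * confounder))    # propensity model
    treated = rng.binomial(1, p_treat)
    weights = np.where(treated == 1, 1 / p_treat, 1 / (1 - p_treat))

    # Invented outcome: treatment lowers the preventable-ED-visit rate
    outcome = rng.binomial(1, 0.25 - 0.10 * treated + 0.05 * (confounder > 0))

    # Weighted rates approximate the marginal (confounder-balanced) contrast
    for arm in (1, 0):
        m = treated == arm
        rate = np.average(outcome[m], weights=weights[m])
        print(f"arm={arm}: weighted preventable-visit rate = {rate:.3f}")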
MyPOD: an EMR-Based Tool that Facilitates Quality Improvement and Maintenance of Certification.
Berman, Loren; Duffy, Brian; Randall Brenn, B; Vinocur, Charles
2017-03-01
Maintenance of Certification (MOC) was designed to assess physician competencies including operative case volume and outcomes. This information, if collected consistently and systematically, can be used to facilitate quality improvement. Information automatically extracted from the electronic medical record (EMR) can be used as a prompt to compile these data. We developed an EMR-based program called MyPOD (My Personal Outcomes Data) to track surgical outcomes at our institution. We compared occurrences reported in the first 18 months to those captured in the American College of Surgeons National Surgical Quality Improvement Program-Pediatric (ACS NSQIP-P) over the same time period. During the first 18 months of using MyPOD, 691 cases were captured in both MyPOD and NSQIP-P. There were 48 cases with occurrences in NSQIP-P (6.9% occurrence rate). MyPOD captured 33% of the occurrences and 83% of the deaths reported in NSQIP-P. Use of the MyPOD program helped to identify series of complications and facilitated systematic change to improve outcomes. MyPOD provides comparative data that is essential in performance evaluation and facilitates quality improvement in surgery. This program and similar EMR-driven tools are becoming essential components of the MOC process. Our initial review has revealed opportunities for improvement in self-reporting which we can continue to measure by comparison to NSQIP-P. In addition, it has identified systems issues that have led to hospital-wide improvements.
Zhai, Haibo; Ou, Yang; Rubin, Edward S
2015-07-07
This study employs a power plant modeling tool to explore the feasibility of reducing unit-level emission rates of CO2 by 30% by retrofitting carbon capture, utilization, and storage (CCUS) to existing U.S. coal-fired electric generating units (EGUs). Our goal is to identify feasible EGUs and their key attributes. The results indicate that for about 60 gigawatts of the existing coal-fired capacity, the implementation of partial CO2 capture appears feasible, though its cost is highly dependent on the unit characteristics and fuel prices. Auxiliary gas-fired boilers can be employed to power a carbon capture process without significant increases in the cost of electricity generation. A complementary CO2 emission trading program can provide additional economic incentives for the deployment of CCS with 90% CO2 capture. Selling and utilizing the captured CO2 product for enhanced oil recovery can further accelerate CCUS deployment and also help reinforce a CO2 emission trading market. These efforts would allow existing coal-fired EGUs to continue to provide a significant share of the U.S. electricity demand.
NASA Astrophysics Data System (ADS)
Tornow, W.; Bhike, Megha
2015-05-01
A program is underway at the Triangle Universities Nuclear Laboratory (TUNL) to measure the neutron capture cross section in the 0.5 to 15 MeV energy range on nuclei whose radioactive daughters could potentially create backgrounds in searches for rare events. Here, we refer to neutrino-less double-beta decay and dark-matter searches, and to detectors built for neutrino and/or antineutrino studies. Neutron capture cross-section data obtained by using the activation method are reported for 40Ar, 74,76Ge, 128,130Te and 136Xe and compared to model calculations and evaluations.
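For context, the activation method infers a capture cross section from the activity induced in the sample during irradiation: A = N * sigma * phi * (1 - exp(-lambda * t_irr)) for a thin target. A minimal sketch of the forward relation, with invented target, flux, and half-life values rather than TUNL's:

    import math

    def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
        """End-of-irradiation activity (Bq) for a thin target."""
        lam = math.log(2) / half_life_s
        saturation = 1.0 - math.exp(-lam * t_irr_s)
        return n_atoms * sigma_cm2 * flux * saturation

    # Illustrative numbers only: mg-scale target, 5 mb cross section
    a = induced_activity(n_atoms=4.4e18, sigma_cm2=5e-27,
                         flux=1e7,            # neutrons / (cm^2 s)
                         half_life_s=82.8 * 60, t_irr_s=3600)
    print(f"activity at end of irradiation: {a:.3e} Bq")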
NRG CO2NCEPT - Confirmation Of Novel Cost-effective Emerging Post-combustion Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevenson, Matthew; Armpriester, Anthony
Under DOE's solicitation DE-FOA-0001190, NRG and Inventys conceptualized a large-scale pilot (>10 MWe) post-combustion CO2 capture project using Inventys' VeloxoTherm™ carbon capture technology. The technology comprises an intensified thermal swing adsorption (TSA) process that uses a patented architecture of structured adsorbent and a novel process design and embodiment to capture CO2 from industrial flue gas streams. The result of this work concluded that the retrofit of this technology is economically and technically viable, but that the sorbent material selected for the program would need improving to meet the techno-economic performance requirements of the solicitation.
Failure of communication and capture: The perils of temporary unipolar pacing system.
Sahinoglu, Efe; Wool, Thomas J; Wool, Kenneth J
2015-06-01
We present a case of a patient with pacemaker dependence secondary to complete heart block who developed loss of capture of her temporary pacemaker. The patient developed torsades de pointes and then ventricular fibrillation, requiring CPR and external cardioversion. After the patient was stabilized, it was noticed that the loss of capture corresponded with nursing care, when the pulse generator was lifted off the patient's chest wall, and that the patient's temporary pacing system had been programmed to unipolar mode without the knowledge of the attending cardiologist. This case highlights the importance of communication in ensuring that all caregivers are aware of the mode of the temporary pacing system.
ERIC Educational Resources Information Center
Igami, Masatsura; Okazaki, Teruo
2007-01-01
This analysis aims at capturing current inventive activities in nanotechnologies based on the analysis of patent applications to the European Patent Office (EPO). Reported findings include: (1) Nanotechnology is a multifaceted technology, currently consisting of a set of technologies on the nanometre scale rather than a single technological field;…
Improving designer productivity. [artificial intelligence
NASA Technical Reports Server (NTRS)
Hill, Gary C.
1992-01-01
Designer and design team productivity improves with skill, experience, and the tools available. The design process involves numerous trials and errors, analyses, refinements, and addition of details. Computerized tools have greatly speeded the analysis, and now new theories and methods, emerging under the label Artificial Intelligence (AI), are being used to automate skill and experience. These tools improve designer productivity by capturing experience, emulating recognized skillful designers, and making the essence of complex programs easier to grasp. This paper outlines the aircraft design process in today's technology and business climate, presenting some of the challenges ahead and some of the promising AI methods for meeting these challenges.
Whyte, E F; Richter, C; O'Connor, S; Moran, K A
2018-02-01
Deficits in trunk control predict ACL injuries, which frequently occur during high-risk activities such as cutting. However, no existing trunk control/core stability program has been found to positively affect trunk kinematics during cutting activities. This study investigated the effectiveness of a 6-week dynamic core stability (DCS) program on the biomechanics of anticipated and unanticipated side and crossover cutting maneuvers. Thirty-one male varsity footballers participated in this randomized controlled trial. Three-dimensional trunk and lower limb biomechanics were captured in a motion analysis laboratory during the weight acceptance phase of anticipated and unanticipated side and crossover cutting maneuvers at baseline and 6-week follow-up. The DCS group performed a DCS program three times weekly for 6 weeks in a university rehabilitation room. Both the DCS and control groups concurrently completed their regular practice and match play. Statistical parametric mapping and repeated measures analysis of variance were used to determine any group (DCS vs control) by time (pre vs post) interactions. The DCS resulted in greater internal hip extensor (P=.017, η²=0.079), smaller internal knee valgus (P=.026, η²=0.076), and smaller internal knee external rotator moments (P=.041, η²=0.066) during anticipated side cutting compared with the control group. It also led to reduced posterior ground reaction forces for all cutting activities (P=.015-.030, η²=0.074-0.105). A 6-week DCS program did not affect trunk kinematics, but it did reduce a small number of biomechanical risk factors for ACL injury, predominantly during anticipated side cutting. A DCS program could play a role in multimodal ACL injury prevention programs. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
77 FR 5615 - Information Collection Activity; Request for Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-03
...) 366-1930. SUPPLEMENTARY INFORMATION: Title: U.S. Department of Transportation Mentor-Protégé Pilot Program Evaluation Form; and U.S. Department of Transportation Mentor-Protégé Pilot Program Annual Report. Abstract: DOT will use the data captured in the Mentor-Protég...
Teaching Perspectives among Introductory Computer Programming Faculty in Higher Education
ERIC Educational Resources Information Center
Mainier, Michael J.
2011-01-01
This study identified the teaching beliefs, intentions, and actions of 80 introductory computer programming (CS1) faculty members from institutions of higher education in the United States using the Teacher Perspectives Inventory. Instruction method used inside the classroom, categorized by ACM CS1 curriculum guidelines, was also captured along…
Literacy, Language and Social Interaction in Special Schools
ERIC Educational Resources Information Center
Reichenberg, Monica
2015-01-01
The present study is a follow-up to a quantitative intervention study in which two intervention programs, Reciprocal Teaching and Inference Training, were practiced. This study aims at capturing the potential benefits and qualitative aspects of one of the programs evaluated, Reciprocal Teaching. More specifically, I have investigated the video…
WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL (EPA/600/SR-94/210)
A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a groundwater flo...
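The flavor of a time-of-travel capture zone calculation can be conveyed by superposing a regional uniform flow and a single pumping well, then back-tracking particles from the well for the travel time of interest. The sketch below is a crude stand-in for illustration, not WhAEM's analytic element formulation, and every aquifer parameter is assumed:

    import math

    U = 0.1           # regional Darcy flux, m/day, in +x
    Q = 500.0         # well pumping rate, m^3/day (well at origin)
    b, n = 10.0, 0.3  # aquifer thickness (m) and porosity

    def seepage_velocity(x, y):
        """Uniform flow plus a well sink, by superposition."""
        r2 = x * x + y * y
        vx = (U - Q / (2 * math.pi * b) * x / r2) / n
        vy = (-Q / (2 * math.pi * b) * y / r2) / n
        return vx, vy

    def backtrack(x, y, days, dt=0.25):
        """Reverse particle tracking from a point near the well."""
        for _ in range(int(days / dt)):
            vx, vy = seepage_velocity(x, y)
            x, y = x - vx * dt, y - vy * dt   # step against the flow
        return x, y

    # Points back-tracked one year outline the 1-year capture zone
    for deg in range(0, 360, 45):
        x0, y0 = math.cos(math.radians(deg)), math.sin(math.radians(deg))
        print(f"{deg:3d} deg: {backtrack(x0, y0, days=365.0)}")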
17 CFR 38.552 - Elements of an acceptable audit trail program.
Code of Federal Regulations, 2014 CFR
2014-04-01
... of the order shall also be captured. (b) Transaction history database. A designated contract market's audit trail program must include an electronic transaction history database. An adequate transaction history database includes a history of all trades executed via open outcry or via entry into an electronic...
17 CFR 38.552 - Elements of an acceptable audit trail program.
Code of Federal Regulations, 2013 CFR
2013-04-01
... of the order shall also be captured. (b) Transaction history database. A designated contract market's audit trail program must include an electronic transaction history database. An adequate transaction history database includes a history of all trades executed via open outcry or via entry into an electronic...
Eddingsaas, Nathan; Pagano, Todd; Cummings, Cody; Rahman, Irfan; Robinson, Risa; Hensel, Edward
2018-02-13
This work investigates emissions sampling methods employed for qualitative identification of compounds in e-liquids and their resultant aerosols, to assess which capture methods may be sufficient to identify harmful and potentially harmful constituents present. Three popular e-liquid flavors (cinnamon, mango, vanilla) were analyzed using qualitative gas chromatography-mass spectrometry (GC-MS) in the un-puffed state. Each liquid was also machine-puffed under realistic-use flow rate conditions and the emissions were captured using two techniques: filter pads and methanol impingers. GC-MS analysis was conducted on the emissions captured using both techniques from all three e-liquids. The e-liquid GC-MS analysis resulted in positive identification of 13 compounds from the cinnamon flavor e-liquid, 31 from mango, and 19 from vanilla, including a number of compounds observed in all e-liquid experiments. Nineteen compounds were observed in emissions that were not present in the un-puffed e-liquid. Qualitative GC-MS analysis of the emissions samples identified compounds observed in all three sample types (e-liquid, impinger, and filter pad) and in each subset thereof. A limited number of compounds were observed in emissions captured with impingers but not in emissions captured using filter pads; a larger number of compounds were observed in emissions collected on the filter pads but not in those captured with impingers. It is demonstrated that the sampling methods have different sampling efficiencies and that some compounds may be missed using only one method. It is recommended to investigate filter pads, impingers, thermal desorption tubes, and solvent extraction resins to establish robust sampling methods for emissions testing of e-cigarettes.
Laser Capture Microdissection for Protein and NanoString RNA analysis
Golubeva, Yelena; Salcedo, Rosalba; Mueller, Claudius; Liotta, Lance A.; Espina, Virginia
2013-01-01
Laser capture microdissection (LCM) allows the precise procurement of enriched cell populations from a heterogeneous tissue, or live cell culture, under direct microscopic visualization. Histologically enriched cell populations can be procured by harvesting cells of interest directly, or by isolating specific cells by ablating unwanted cells. The basic components of laser microdissection technology are a) visualization of cells via light microscopy, b) transfer of laser energy to a thermolabile polymer with either the formation of a polymer-cell composite (capture method) or transfer of laser energy via an ultraviolet laser to photovolatilize a region of tissue (cutting method), and c) removal of cells of interest from the heterogeneous tissue section. The capture and cutting methods (instruments) for laser microdissection differ in the manner by which cells of interest are removed from the heterogeneous sample. Laser energy in the capture method is infrared (810 nm), while in the cutting mode the laser is ultraviolet (355 nm). Infrared lasers melt a thermolabile polymer that adheres to the cells of interest, whereas ultraviolet lasers ablate cells for either removal of unwanted cells or excision of a defined area of cells. LCM technology is applicable to an array of applications including mass spectrometry, DNA genotyping and loss-of-heterozygosity analysis, RNA transcript profiling, cDNA library generation, proteomics discovery, and signal kinase pathway profiling. This chapter describes laser capture microdissection using an ArcturusXT instrument for protein LCM sample analysis, and using a mmi CellCut Plus® instrument for RNA analysis via NanoString technology. PMID:23027006
Seamans, David P; Louka, Boshra F; Fortuin, F David; Patel, Bhavesh M; Sweeney, John P; Lanza, Louis A; DeValeria, Patrick A; Ezrre, Kim M; Ramakrishna, Harish
2016-10-01
The surgical and procedural specialties are continually evolving their methods to include more complex and technically difficult cases. These cases can be longer and incorporate multiple teams in a different model of operating room synergy. Patients are frequently older, with comorbidities adding to the complexity of these cases. Recording of this environment has become more feasible recently with advancements in the video and audio capture systems often used in the simulation realm. We began using live capture to record a new procedure shortly after starting these cases in our institution, providing continued assessment and evaluation of live procedures. The goal was to address human factors and situational challenges through review and debriefing. B-Line Medical's LiveCapture video system was used to record successive transcatheter aortic valve replacement (TAVR) procedures in our cardiac catheterization laboratory. An illustrative case is presented that resulted in long-term changes to our approach to these cases: the video capture documented rare events during one of our TAVR procedures, and analysis and debriefing led to definitive changes in our practice. While there are hurdles to the use of this technology in every institution, the ongoing use of video capture, analysis, and debriefing may play an important role in the future of patient safety and human factors analysis in the operating environment.
Economic and energetic analysis of capturing CO2 from ambient air
House, Kurt Zenz; Baclig, Antonio C.; Ranjan, Manya; van Nierop, Ernst A.; Wilcox, Jennifer; Herzog, Howard J.
2011-01-01
Capturing carbon dioxide from the atmosphere (“air capture”) in an industrial process has been proposed as an option for stabilizing global CO2 concentrations. Published analyses suggest these air capture systems may cost a few hundred dollars per tonne of CO2, making them cost competitive with mainstream CO2 mitigation options like renewable energy, nuclear power, and carbon dioxide capture and storage from large CO2 emitting point sources. We investigate the thermodynamic efficiencies of commercial separation systems as well as trace gas removal systems to better understand and constrain the energy requirements and costs of these air capture systems. Our empirical analyses of operating commercial processes suggest that the energetic and financial costs of capturing CO2 from the air are likely to have been underestimated. Specifically, our analysis of existing gas separation systems suggests that, unless air capture significantly outperforms these systems, it is likely to require more than 400 kJ of work per mole of CO2, requiring it to be powered by CO2-neutral power sources in order to be CO2 negative. We estimate that total system costs of an air capture system will be on the order of $1,000 per tonne of CO2, based on experience with as-built large-scale trace gas removal systems. PMID:22143760
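The roughly 400 kJ/mol figure can be compared with the second-law minimum work for extracting CO2 from air in the dilute limit, W_min ≈ -RT ln(x_CO2). A short sketch (the 400 ppm ambient mole fraction is an assumption of the sketch; the full expression adds smaller terms for the CO2-depleted air stream):

    import math

    R, T = 8.314, 298.15   # J/(mol K), K
    x_co2 = 400e-6         # assumed ambient CO2 mole fraction

    w_min = -R * T * math.log(x_co2)   # J per mol of captured CO2
    print(f"thermodynamic minimum: {w_min / 1000:.1f} kJ/mol CO2")

    w_real = 400e3  # the paper's empirical threshold for real systems, J/mol
    print(f"implied second-law efficiency: {w_min / w_real:.1%}")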
Growing Nurse Leaders: Their Perspectives on Nursing Leadership and Today’s Practice Environment
Dyess, Susan M; Sherman, Rose O; Pratt, Beth A; Chiang-Hanisko, Lenny
2016-01-14
With the growing complexity of healthcare practice environments and pending nurse leader retirements, the development of future nurse leaders is increasingly important. This article reports on focus group research conducted with Generation Y nurses prior to their initiating coursework in a Master's Degree program designed to support development of future nurse leaders. Forty-four emerging nurse leaders across three program cohorts participated in this qualitative study conducted to capture perspectives about nursing leaders and leadership. Conventional content analysis was used to analyze and code the data into categories. We discuss the three major categories identified: idealistic expectations of leaders, leading in a challenging practice environment, and a cautious but optimistic outlook about their own leadership and future. We also note study limitations. The conclusion offers implications for future nurse leader development. The findings provide important insight into the viewpoints of nurses today about leaders and leadership.
Improving spatial perception in 5-yr.-old Spanish children.
Jiménez, Andrés Canto; Sicilia, Antonio Oña; Vera, Juan Granda
2007-06-01
Assimilation of distance perception was studied in 70 Spanish primary school children. This assimilation involves the generation of projective images, which are acquired through two mechanisms. One mechanism is spatial perception, wherein perceptual processes develop ensuring successful immersion in space and the acquisition of visual cues that a person may use to interpret images seen in the distance. The other mechanism is movement through space so that these images are produced. The present study evaluated how using increasingly larger spaces for training sessions within a motor skills program influenced improvements in spatial perception. Visual parameters were measured in relation to the capture and tracking of moving objects (ocular motility) and speed of detection (visual reaction time). Analysis showed that for the group trained in increasingly larger spaces, ocular motility and visual reaction time improved significantly during different phases of the program.
Exact and Approximate Probabilistic Symbolic Execution
NASA Technical Reports Server (NTRS)
Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem
2014-01-01
Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
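The core idea, path probabilities combined with maximization over nondeterministic choices, can be miniaturized as brute-force counting over a finite uniform input domain. The program, domain, and branch conditions below are invented for illustration; Symbolic PathFinder computes these quantities symbolically rather than by enumeration:

    DOMAIN = range(100)  # inputs assumed uniformly distributed

    # Toy program: input x, then a nondeterministic choice c in {0, 1};
    # the target is reached iff the branch for the chosen c holds.
    def reaches_target(x, c):
        return x > 70 if c == 0 else x % 3 == 0

    # A scheduler resolves the nondeterminism; pick the one maximizing
    # the probability of the target, computed by model counting.
    best = max((sum(reaches_target(x, c) for x in DOMAIN) / len(DOMAIN), c)
               for c in (0, 1))
    print(f"best scheduler fixes c={best[1]} with P(target) = {best[0]:.2f}")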
Texture segmentation by genetic programming.
Song, Andy; Ciesielski, Vic
2008-01-01
This paper describes a texture segmentation method using genetic programming (GP), one of the most powerful evolutionary computation algorithms. By choosing an appropriate representation, texture classifiers can be evolved without computing texture features. Because it avoids time-consuming feature extraction, the evolved classifiers enable the proposed texture segmentation algorithm to achieve a segmentation speed significantly higher than that of conventional methods. The method does not require a human expert to manually construct models for texture feature extraction. An analysis of the evolved classifiers shows that they are not arbitrary: certain textural regularities are captured by the classifiers to discriminate different textures. This study shows GP to be a feasible and powerful approach to texture classification and segmentation, which are generally considered complex vision tasks.
Xu, Yihua; Pitot, Henry C
2006-03-01
In studies of quantitative stereology of rat hepatocarcinogenesis, we have used image analysis technology (automatic particle analysis) to obtain data such as liver tissue area, size and location of altered hepatic focal lesions (AHF), and nuclei counts. These data are then used for three-dimensional estimation of AHF occurrence and nuclear labeling index analysis, important parameters for quantitative studies of carcinogenesis, for screening and classifying carcinogens, and for risk estimation. To take such measurements, structures or cells of interest must be separated from the other components based on differences in color and density. Common background problems in captured sample images, such as uneven light illumination or color shading, can cause severe problems in the measurement. Two application programs (BK_Correction and Pixel_Separator) have been developed to solve these problems. With BK_Correction, common background problems such as incorrect color temperature setting, color shading, and uneven illumination can be corrected. With Pixel_Separator, different types of objects can be separated from each other by color, as seen with differently colored immunohistochemically stained slides. The resultant images of objects separated from other components are then ready for particle analysis. Objects that have the same darkness but different colors can be accurately differentiated in a grayscale image analysis system after application of these programs.
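A common way to remove the uneven-illumination problem that BK_Correction targets is flat-field division by a blank background image. The sketch below shows that generic operation on synthetic data; it is a stand-in under stated assumptions, not the authors' implementation:

    import numpy as np

    def correct_background(image, blank):
        """Divide by a blank-field image to flatten uneven illumination
        and color shading, then rescale to the original intensity range."""
        blank = np.clip(blank.astype(float), 1e-6, None)
        corrected = image.astype(float) / blank * blank.mean()
        return np.clip(corrected, 0, 255).astype(np.uint8)

    # Toy example: gradient-shaded background with one dark "nucleus"
    blank = np.tile(np.linspace(120, 240, 64), (64, 1))
    img = blank.copy()
    img[30:34, 30:34] = 40          # object of interest
    flat = correct_background(img, blank)
    print(flat.min(), flat.max())   # background is now uniform at its mean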
Hilton, Lara; Elfenbaum, Pamela; Jain, Shamini; Sprengel, Meredith; Jonas, Wayne B
2018-03-01
The evaluation of freestanding integrative cancer clinical programs is challenging and is rarely done. We have developed an approach called the Claim Assessment Profile (CAP) to identify whether evaluation of a practice is justified, feasible, and likely to provide useful information. A CAP was performed in order to (1) clarify the healing claims at InspireHealth, an integrative oncology treatment program, by defining the most important impacts on its clients; (2) gather information about current research capacity at the clinic; and (3) create a program theory and path model for use in prospective research. This case study design incorporates methods from a variety of rapid assessment approaches. Procedures included site visits to observe the program, structured qualitative interviews with 26 providers and staff, surveys to capture descriptive data about the program, and observational data on program implementation. The InspireHealth program is a well-established, multi-site, thriving integrative oncology clinical practice that focuses on patient support, motivation, and health behavior engagement. It delivers patient-centered care via a standardized treatment protocol. There are high levels of research interest from staff and resources with which to conduct research. This analysis provides the primary descriptive and claims clarification of an integrative oncology treatment program, an evaluation readiness report, a detailed logic model explicating program theory, and a clinical outcomes path model for conducting prospective research. Prospective evaluation of this program would be feasible and valuable, adding to our knowledge base of integrative cancer therapies.
ERIC Educational Resources Information Center
Jones, Lawrence; Graham, Ian
1986-01-01
Reviews the main principles of interfacing and discusses the software developed to perform kinetic data capture and analysis with a BBC microcomputer linked to a recording spectrophotometer. Focuses on the steps in software development. Includes results of a lactate dehydrogenase assay. (ML)
Quantitative analysis of arm movement smoothness
NASA Astrophysics Data System (ADS)
Szczesna, Agnieszka; Błaszczyszyn, Monika
2017-07-01
The paper deals with the problem of quantitative smoothness analysis of motion data. We investigated values of movement unit, fluidity, and jerk for the healthy and paralyzed arms of patients with hemiparesis after stroke. Patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
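Of the three measures, jerk is the most standardized; a common formulation is the negative log dimensionless jerk, which penalizes high-frequency ripple in the velocity profile. A sketch for a 1-D trace, with synthetic data standing in for motion-capture output (the exact metric definitions used in the paper may differ):

    import numpy as np

    def log_dimensionless_jerk(position, fs):
        """Smoothness of a 1-D trace; more negative means less smooth."""
        dt = 1.0 / fs
        vel = np.gradient(position, dt)
        jerk = np.gradient(np.gradient(vel, dt), dt)
        duration = len(position) * dt
        dlj = duration**3 / np.abs(vel).max()**2 * (jerk**2).sum() * dt
        return -np.log(dlj)

    t = np.linspace(0.0, 1.0, 200)
    smooth = np.sin(np.pi * t / 2)                  # smooth reach
    shaky = smooth + 0.01 * np.sin(40 * np.pi * t)  # tremor superimposed
    for name, trace in (("smooth", smooth), ("shaky", shaky)):
        print(name, round(log_dimensionless_jerk(trace, fs=200.0), 2))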
An overview of DANCE: a 4π BaF2 detector for neutron capture measurements at LANSCE.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ullmann, J. L.
2004-01-01
The Detector for Advanced Neutron Capture Experiments (DANCE) is a 162-element, 4π BaF2 array designed to make neutron capture cross-section measurements on rare or radioactive targets with masses as little as 1 mg. Accurate capture cross sections are needed in many research areas, including stellar nucleosynthesis, advanced nuclear fuel cycles, waste transmutation, and other applied programs. These cross sections are difficult to calculate accurately and must be measured. Up to now, except for a few long-lived nuclides, there are essentially no differential capture measurements on radioactive nuclei. The DANCE array is located at the Lujan Neutron Scattering Center at LANSCE, which is a continuous-spectrum neutron source with useable energies from below thermal to about 100 keV. Data acquisition is done with 320 fast waveform digitizers. The design and initial performance results, including background minimization, will be discussed.
Current and Future Research at DANCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jandel, M.; Baramsai, B.; Bredeweg, T. A.
2015-05-28
An overview of the current experimental program on measurements of neutron capture and neutron-induced fission at the Detector for Advanced Neutron Capture Experiments (DANCE) is presented. Three major projects are currently under way: 1) high precision measurements of neutron capture cross sections on uranium isotopes, 2) research aimed at studies of short-lived actinide isomer production in neutron capture on 235U, and 3) measurements of correlated data of fission observables. New projects include development of auxiliary detectors to improve the capability of DANCE. We are building a compact, segmented NEUtron detector Array at DANCE (NEUANCE), which will be installed in the central cavity of the DANCE array. It will thus provide experimental information on prompt fission neutrons in coincidence with the prompt fission gamma rays measured by the 160 BaF2 crystals of DANCE. Additionally, unique correlated data will be obtained for neutron capture and neutron-induced fission using the DANCE-NEUANCE experimental setup in the future.
Capture and fission with DANCE and NEUANCE
NASA Astrophysics Data System (ADS)
Jandel, M.; Baramsai, B.; Bond, E.; Rusev, G.; Walker, C.; Bredeweg, T. A.; Chadwick, M. B.; Couture, A.; Fowler, M. M.; Hayes, A.; Kawano, T.; Mosby, S.; Stetcu, I.; Taddeucci, T. N.; Talou, P.; Ullmann, J. L.; Vieira, D. J.; Wilhelmy, J. B.
2015-12-01
A summary of the current and future experimental program at DANCE is presented. Measurements of neutron capture cross sections are planned for many actinide isotopes with the goal to reduce the present uncertainties in nuclear data libraries. Detailed studies of capture gamma rays in the neutron resonance region will be performed in order to derive correlated data on the de-excitation of the compound nucleus. New approaches on how to remove the DANCE detector response from experimental data and retain the correlations between the cascade gamma rays are presented. Studies on 235U are focused on quantifying the population of short-lived isomeric states in 236U after neutron capture. For this purpose, a new neutron detector array NEUANCE is under construction. It will be installed in the central cavity of the DANCE array and enable the highly efficient tagging of fission and capture events. In addition, developments of fission fragment detectors are also underway to expand DANCE capabilities to measurements of fully correlated data on fission observables.
Lee, Hun Joo; Cho, Hyeon-Yeol; Oh, Jin Ho; Namkoong, Kak; Lee, Jeong Gun; Park, Jong-Myeon; Lee, Soo Suk; Huh, Nam; Choi, Jeong-Woo
2013-09-15
Using hybrid nanoparticles (HNPs), we demonstrate simultaneous capture, in situ protein expression analysis, and cellular phenotype identification of circulating tumor cells (CTCs). Each HNP consists of three parts: (i) antibodies that bind specifically to a known biomarker for CTCs, (ii) a quantum dot that emits fluorescence signals, and (iii) biotinylated DNA that allows capture and release of the CTC-HNP complex on an in-house developed capture and recovery chip (CRC). To evaluate our approach, cells representative of different breast cancer subtypes (MCF-7: luminal; SK-BR-3: HER2; and MDA-MB-231: basal-like) were captured onto the CRC and expressions of EpCAM, HER2, and EGFR were detected concurrently. The average capture efficiency of CTCs was 87.5%, with an identification accuracy of 92.4%. Subsequently, by cleaving the DNA portion with restriction enzymes, captured cells were released at efficiencies of 86.1%. Further studies showed that these recovered cells are viable and can proliferate in vitro. Using HNPs, it is possible to count, analyze in situ protein expression of, and culture CTCs, all from the same set of cells, enabling a wide range of molecular- and cellular-based studies using CTCs. Copyright © 2013 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryan Hruska
Currently, small Unmanned Aerial Vehicles (UAVs) are primarily used for capturing and down-linking real-time video. To date, their role as a low-cost airborne platform for capturing high-resolution, georeferenced still imagery has not been fully utilized. On-going work within the Unmanned Vehicle Systems Program at the Idaho National Laboratory (INL) is attempting to exploit this small UAV-acquired still imagery potential. Initially, a UAV-based still imagery work flow model was developed that includes initial UAV mission planning, sensor selection, UAV/sensor integration, and imagery collection, processing, and analysis. Components to support each stage of the work flow are also being developed. Critical to the use of acquired still imagery is the ability to detect changes between images of the same area over time. To enhance the analysts' change detection ability, a UAV-specific, GIS-based change detection system called SADI, or System for Analyzing Differences in Imagery, is under development. This paper will discuss the associated challenges and approaches to collecting still imagery with small UAVs. Additionally, specific components of the developed work flow system will be described and graphically illustrated using varied examples of small UAV-acquired still imagery.
Foreman, William T.; Connor, Brooke F.; Furlong, Edward T.; Vaught, Deborah G.; Merten, Leslie M.
1995-01-01
A method for the determination of 30 individual organochlorine pesticides, total toxaphene, and total polychlorinated biphenyls (PCBs) in bottom sediment is described. The method isolates the pesticides and PCBs by solvent extraction with dichloromethane; removes inorganic sulfur, large naturally occurring molecules, and other unwanted interferences by gel permeation chromatography; and further cleans up and class-fractionates the extract using adsorption chromatography. The compounds are then instrumentally determined using dual capillary-column gas chromatography with electron-capture detection. Reporting limits range from 1 to 5 micrograms per kilogram for the 30 individual pesticides, 50 micrograms per kilogram for total PCBs, and 200 micrograms per kilogram for total toxaphene. The method also is designed to allow the simultaneous isolation of 79 other semivolatile organic compounds from the sediment, which are separately quantified using gas chromatography with mass spectrometric detection. The method was developed in support of the U.S. Geological Survey's National Water-Quality Assessment program.
Aerocapture Performance Analysis of A Venus Exploration Mission
NASA Technical Reports Server (NTRS)
Starr, Brett R.; Westhelle, Carlos H.
2005-01-01
A performance analysis of a Discovery-class Venus exploration mission, in which aerocapture is used to capture a spacecraft into a 300 km polar orbit for a two-year science mission, has been conducted to quantify its performance. A preliminary performance assessment determined that a high-heritage 70-deg sphere-cone rigid aeroshell with a 0.25 lift-to-drag ratio has adequate control authority to provide an entry flight path angle corridor large enough for the mission's aerocapture maneuver. A reference vehicle with a ballistic coefficient of 114 kilograms per square meter was developed from the science requirements and the preliminary assessment's heating indicators and deceleration loads. Performance analyses were conducted for the reference vehicle and for sensitivity studies on vehicle ballistic coefficient and maximum bank rate. The performance analyses used a high-fidelity flight simulation within a Monte Carlo executive to define the aerocapture heating environment and deceleration loads and to determine mission success statistics. The simulation utilized the Program to Optimize Simulated Trajectories (POST), modified to include Venus-specific atmospheric and planet models, aerodynamic characteristics, and interplanetary trajectory models. In addition to the Venus-specific models, an autonomous guidance system, HYPAS, and a pseudo flight controller were incorporated in the simulation. The Monte Carlo analyses incorporated a reference set of approach trajectory delivery errors, aerodynamic uncertainties, and atmospheric density variations. The reference performance analysis determined that the reference vehicle achieves 100% successful capture and has a 99.87% probability of attaining the science orbit with a 90 meters per second delta V budget for post-aerocapture orbital adjustments. A ballistic coefficient trade study conducted with reference uncertainties determined that the 0.25 L/D vehicle can achieve 100% successful capture with a ballistic coefficient of 228 kilograms per square meter, and that the increased ballistic coefficient increases the post-aerocapture delta V budget to 134 meters per second for a 99.87% probability of attaining the science orbit. A trade study on vehicle bank rate determined that the 0.25 L/D vehicle can achieve 100% successful capture when the maximum bank rate is decreased from 30 deg/s to 20 deg/s. The decreased bank rate increases the post-aerocapture delta V budget to 102 meters per second for a 99.87% probability of attaining the science orbit.
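The Monte Carlo success statistic can be caricatured in a few lines: sample the delivered entry flight-path angle about its aim point and count samples that fall inside the theoretical corridor. The corridor bounds and 1-sigma delivery error below are invented placeholders, not values from the POST-based analysis:

    import random

    CORRIDOR = (-6.5, -5.5)   # acceptable entry angle, deg (assumed)
    AIM, SIGMA = -6.0, 0.12   # aim point and 1-sigma error, deg (assumed)

    def monte_carlo_capture(trials=100_000, seed=1):
        rng = random.Random(seed)
        hits = sum(CORRIDOR[0] <= rng.gauss(AIM, SIGMA) <= CORRIDOR[1]
                   for _ in range(trials))
        return hits / trials

    print(f"estimated capture probability: {monte_carlo_capture():.4%}")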
JView Visualization for Next Generation Air Transportation System
2011-01-01
hardware graphics acceleration. JView relies on concrete Object Oriented Design (OOD) and programming techniques to provide a robust and venue non...visibility priority of a texture set. A good example of this is when you have translucent images that should always be visible over the other textures...elements present in the scene. • Capture Alpha. Allows the alpha color channel (translucency) to be saved when capturing images or movies of a 3D scene
Automated rendezvous and capture development infrastructure
NASA Technical Reports Server (NTRS)
Bryan, Thomas C.; Roe, Fred; Coker, Cynthia
1992-01-01
The facilities at Marshall Space Flight Center and JSC to be utilized to develop and test an autonomous rendezvous and capture (ARC) system are described. This includes equipment and personnel facility capabilities to devise, develop, qualify, and integrate ARC elements and subsystems into flight programs. Attention is given to the use of a LEO test facility, the current concept and unique system elements of the ARC, and the options available to develop ARC technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, J; Christianson, O; Samei, E
Purpose: Flood-field uniformity evaluation is an essential element in the assessment of nuclear medicine (NM) gamma cameras. It serves as the central element of the quality control (QC) program, acquired and analyzed on a daily basis prior to clinical imaging. Uniformity images are traditionally analyzed using pixel value-based metrics, which often fail to capture subtle structure and patterns caused by changes in gamma camera performance, requiring additional visual inspection that is subjective and time demanding. The goal of this project was to develop and implement a robust QC metrology for NM that is effective in identifying non-uniformity issues and reporting them in a timely manner for efficient correction prior to clinical involvement, all incorporated into an automated, effortless workflow, and to characterize the program over a two-year period. Methods: A new quantitative uniformity analysis metric was developed based on 2D noise power spectrum metrology and confirmed by expert observer visual analysis. The metric, termed the Structured Noise Index (SNI), was then integrated into an automated program to analyze, archive, and report on daily NM QC uniformity images. The effectiveness of the program was evaluated over a period of 2 years. Results: The SNI metric successfully identified visually apparent non-uniformities overlooked by the pixel value-based analysis methods. Implementation of the program has resulted in non-uniformity identification in about 12% of daily flood images. In addition, due to the vigilance of staff response, the percentage of days exceeding the trigger value shows a decline over time. Conclusion: The SNI provides a robust quantification of gamma camera uniformity performance. It operates seamlessly across a fleet of multiple camera models. The automated process provides an effective workflow among the physicist, technologist, and clinical engineer. The reliability of this process has made it the preferred platform for NM uniformity analysis.
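The SNI is built on the 2-D noise power spectrum (NPS) of the daily flood image: structured non-uniformity appears as excess power at low spatial frequencies above the flat quantum-noise floor. A sketch of that NPS building block on a synthetic flood (pixel size and the injected structure are invented; the SNI's exact normalization is not reproduced here):

    import numpy as np

    def noise_power_spectrum_2d(flood, pixel_mm):
        """Single-ROI 2-D NPS estimate of a flood-field image."""
        centered = flood - flood.mean()
        nps = np.abs(np.fft.fftshift(np.fft.fft2(centered))) ** 2
        return nps * pixel_mm**2 / flood.size

    rng = np.random.default_rng(0)
    flood = rng.poisson(1000.0, (128, 128)).astype(float)   # quantum noise
    flood += 5.0 * np.sin(2 * np.pi * np.arange(128) / 16)  # subtle structure
    nps = noise_power_spectrum_2d(flood, pixel_mm=4.42)

    # Structure concentrates power in a few low-frequency bins,
    # far above the median (white-noise) level of the spectrum
    print(f"peak-to-median NPS ratio: {nps.max() / np.median(nps):.1f}")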
NASA Technical Reports Server (NTRS)
Chandler, Michael
2010-01-01
As the Space Shuttle Program comes to an end, it is important that the lessons learned from the Columbia accident be captured and understood by those who will be developing future aerospace programs and supporting current programs. Aeromedical lessons learned from the accident were presented at AsMA in 2005. This panel will update that information, close out the lessons learned, provide additional information on the accident, and provide suggestions for the future. To set the stage, an overview of the accident is required. The Space Shuttle Columbia was returning to Earth with a crew of seven astronauts on 1 February 2003. It disintegrated along a track extending from California to Louisiana, and observers along part of the track filmed the breakup of Columbia. Debris was recovered from Littlefield, Texas to Fort Polk, Louisiana, along a 567-statute-mile track, the largest debris field ever recorded. The Columbia Accident Investigation Board (CAIB) concluded its investigation in August 2003 and released its findings in a report published in February 2004. NASA recognized the importance of capturing the lessons learned from the loss of Columbia and her crew, and the Space Shuttle Program managers commissioned the Spacecraft Crew Survival Integrated Investigation Team (SCSIIT) to accomplish this. Their task was to perform a comprehensive analysis of the accident, focusing on factors and events affecting crew survival, and to develop recommendations for improving crew survival, including the design features, equipment, training, and procedures intended to protect the crew. NASA released the Columbia Crew Survival Investigation Report in December 2008. Key personnel have been assembled to give you an overview of the Space Shuttle Columbia accident, the medical response, the medico-legal issues, the SCSIIT findings and recommendations, and future NASA flight surgeon spacecraft accident response training. Educational Objectives: Set the stage for the panel to address the investigation, medico-legal issues, the Spacecraft Crew Survival Integrated Investigation Team report, and training for accident response.
SU-E-P-05: Electronic Brachytherapy: A Physics Perspective On Field Implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pai, S; Ayyalasomayajula, S; Lee, S
2015-06-15
Purpose: We summarize our experience implementing a successful electronic brachytherapy program at several dermatology clinics with the help of cloud-based software, in order to define the key program parameters and capture the physics QA aspects. Well-developed software helps the physicist peer review and qualify the physical parameters. Methods: Using the XOFT™ Axxent™ electronic brachytherapy system in conjunction with cloud-based software, a process was set up to capture and record treatments. It was implemented initially at about 10 sites in California. For dosimetric purposes, the software facilitated storage of the physics parameters of the surface applicators used in treatment and other source calibration parameters. In addition, the patient prescription, pathology, and other setup considerations were input by the radiation oncologist and the therapist. This facilitated physics planning of the treatment parameters and an independent check of the dwell time. From 2013 to 2014, nearly 1,500 such calculations were completed by a group of physicists. A total of 800 patients with multiple lesions were treated successfully during this period. The treatment log files were uploaded and documented in the software, which facilitated physics peer review of treatments per the standards in place from AAPM and ACR. Results: The program model was implemented successfully at multiple sites. The cloud-based software allowed for proper peer review and compliance of the program at 10 clinical sites. Dosimetry was done on 800 patients and executed in a timely fashion to suit clinical needs. Accumulated physics data in the software from the clinics allows for robust analysis and future development. Conclusion: The experience of implementing electronic brachytherapy from a quality assurance perspective was greatly enhanced by using cloud-based software. The comprehensive database will pave the way for future developments to yield superior physics outcomes.
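The independent dwell-time check that the software facilitates can be as simple as prescribed dose divided by the calibrated dose rate at the prescription depth, compared against the planned value. A minimal sketch with hypothetical numbers (electronic sources do not decay, so no decay correction appears; the clinics' actual calculation method is not specified in the abstract):

    def dwell_time_s(prescribed_dose_gy, dose_rate_gy_per_min,
                     output_correction=1.0):
        """Dwell time = dose / (dose rate x correction), in seconds."""
        return prescribed_dose_gy / (dose_rate_gy_per_min * output_correction) * 60.0

    planned_s = 412.0  # hypothetical planned dwell time
    check_s = dwell_time_s(prescribed_dose_gy=5.0, dose_rate_gy_per_min=0.72)
    print(f"independent check: {check_s:.0f} s "
          f"({(check_s - planned_s) / planned_s:+.1%} vs plan)")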
Cannell, John; Jovic, Emelyn; Rathjen, Amy; Lane, Kylie; Tyson, Anna M; Callisaya, Michele L; Smith, Stuart T; Ahuja, Kiran DK; Bird, Marie-Louise
2017-01-01
Objective: To compare the efficacy of novel interactive, motion-capture rehabilitation software to usual care stroke rehabilitation on physical function. Design: Randomized controlled clinical trial. Setting: Two subacute hospital rehabilitation units in Australia. Participants: In all, 73 people less than six months after stroke with reduced mobility and clinician-determined capacity to improve. Interventions: Both groups received functional retraining and individualized programs for up to an hour, on weekdays, for 8–40 sessions (dose matched). For the intervention group, this individualized program used motivating virtual reality rehabilitation and novel gesture-controlled interactive motion capture software. For usual care, the individualized program was delivered in a group class on one unit and 1:1 by a rehabilitation assistant on the other. Main measures: The primary outcome was standing balance (functional reach). Secondary outcomes were lateral reach, step test, sitting balance, arm function, and walking. Results: Participants (mean 22 days post-stroke) attended a mean of 14 sessions. Both groups improved (mean (95% confidence interval)) on the primary outcome of functional reach (usual care 3.3 (0.6 to 5.9) cm, intervention 4.1 (−3.0 to 5.0) cm) with no difference between groups (P = 0.69) on this or any secondary measure. No differences between the rehabilitation units were seen except in lateral reach (less affected side) (P = 0.04). No adverse events were recorded during therapy. Conclusion: Interactive, motion capture rehabilitation for inpatients post stroke produced functional improvements similar to those achieved by usual care stroke rehabilitation, safely delivered by either a physical therapist or a rehabilitation assistant. PMID:28719977
Capturing in vivo RNA transcriptional dynamics from the malaria parasite Plasmodium falciparum
Painter, Heather J.; Carrasquilla, Manuela; Llinás, Manuel
2017-01-01
To capture the transcriptional dynamics within proliferating cells, methods to differentiate nascent transcription from preexisting mRNAs are desired. One approach is to label newly synthesized mRNA transcripts in vivo through the incorporation of modified pyrimidines. However, the human malaria parasite, Plasmodium falciparum, is incapable of pyrimidine salvage for mRNA biogenesis. To capture cellular mRNA dynamics during Plasmodium development, we engineered parasites that can salvage pyrimidines through the expression of a single bifunctional yeast fusion gene, cytosine deaminase/uracil phosphoribosyltransferase (FCU). We show that expression of FCU allows for the direct incorporation of thiol-modified pyrimidines into nascent mRNAs. Using developmental stage-specific promoters to express FCU-GFP enables the biosynthetic capture and in-depth analysis of mRNA dynamics from subpopulations of cells undergoing differentiation. We demonstrate the utility of this method by examining the transcriptional dynamics of the sexual gametocyte stage transition, a process that is essential to malaria transmission between hosts. Using the pfs16 gametocyte-specific promoter to express FCU-GFP in 3D7 parasites, we found that sexual stage commitment is governed by transcriptional reprogramming and stabilization of a subset of essential gametocyte transcripts. We also measured mRNA dynamics in F12 gametocyte-deficient parasites and demonstrate that the transcriptional program required for sexual commitment and maturation is initiated but likely aborted due to the absence of the PfAP2-G transcriptional regulator and a lack of gametocyte-specific mRNA stabilization. Biosynthetic labeling of Plasmodium mRNAs is incredibly versatile, can be used to measure transcriptional dynamics at any stage of parasite development, and will allow for future applications to comprehensively measure RNA-protein interactions in the malaria parasite. PMID:28416533
The SmartOR: a distributed sensor network to improve operating room efficiency.
Huang, Albert Y; Joerger, Guillaume; Fikfak, Vid; Salmon, Remi; Dunkin, Brian J; Bass, Barbara L; Garbey, Marc
2017-09-01
Despite the significant expense of OR time, best practice achieves only 70% efficiency. Compounding this problem is a lack of real-time data. Most current OR utilization programs require manual data entry. Automated systems require the installation and maintenance of expensive tracking hardware throughout the institution. This study developed an inexpensive, automated OR utilization system and analyzed data from multiple operating rooms. OR activity was deconstructed into four room states. A sensor network was then developed to automatically capture these states using only three sensors, a local wireless network, and a data capture computer. Two systems were then installed into two ORs, with recordings captured 24/7. The SmartOR recorded the following events: any room activity, patient entry/exit time, anesthesia time, laparoscopy time, room turnover time, and time of preoperative patient identification by the surgeon. From November 2014 to December 2015, data on 1003 cases were collected. The mean turnover time was 36 min, and 38% of cases met the institutional goal of ≤30 min. Data analysis also identified outlier cases (>1 SD from the mean) in the domains of time from patient entry into the OR to intubation (11% of cases) and time from extubation to patient exit from the OR (11% of cases). Time from surgeon identification of the patient to scheduled procedure start time was 11 min (institution bylaws require 20 min before the scheduled start time), yet OR teams required 22 min on average to bring a patient into the room after surgeon identification. The SmartOR automatically and reliably captures data on OR room state and, in real time, identifies outlier cases that may be examined more closely to improve efficiency. As no manual entry is required, the data are indisputable and allow OR teams to maintain a patient-centric focus.
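A minimal sketch of the four-room-state idea is given below in Python. The sensor names and mapping rules are invented for illustration; the paper does not publish its exact state logic, only that three sensors suffice to recover the room states.

    from dataclasses import dataclass

    @dataclass
    class SensorFrame:
        motion: bool           # any activity detected in the room
        table_occupied: bool   # hypothetical patient-on-table sensor
        anesthesia_on: bool    # hypothetical anesthesia-machine-in-use sensor

    def room_state(f: SensorFrame) -> str:
        """Map one sensor frame to one of four coarse room states (assumed rules)."""
        if f.anesthesia_on:
            return "CASE_IN_PROGRESS"
        if f.table_occupied:
            return "PATIENT_IN_ROOM"
        if f.motion:
            return "TURNOVER_OR_SETUP"
        return "IDLE"

    # Turnover time is then the span between the frame where the previous
    # patient leaves (table_occupied falls) and the next patient's entry.
    frames = [SensorFrame(True, False, False), SensorFrame(True, True, False)]
    print([room_state(f) for f in frames])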
NASA Astrophysics Data System (ADS)
Chou, Shuo-Ju
2011-12-01
In recent years the United States has shifted from a threat-based acquisition policy that developed systems to counter specific threats to a capabilities-based strategy that emphasizes the acquisition of systems providing critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise," or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies, and the noise variables are the tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty at critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe the uncertainties of program technology elements provides only a qualitative and nondescript estimate of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and their subsequent impacts on capability, budget, and schedule requirements led to the conclusion that an analysis process coupling a probabilistic technique such as Monte Carlo simulation with quantitative, parametric models of technology performance impact and of technology development time and cost requirements would allow the probabilities of meeting specific constraints on these requirements to be established. These probability-of-requirements-success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and a computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition, as well as in the formulation of program development and risk management strategies.
To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques into a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. In order to demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of the method was performed on a notional program for acquiring Carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of the methodology, as well as its limitations, which should be addressed in the future to narrow the gap between the current state and the desired state.
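A minimal sketch of the probability-of-requirements-success metric described above, in Python: technology performance, development cost, and schedule are sampled from assumed distributions and pushed through simple constraints. All distributions and thresholds are illustrative placeholders, not values from the ENTERPRISE study.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000

    # Uncertain technology outcomes (assumed triangular distributions).
    perf = rng.triangular(0.7, 0.9, 1.0, N)      # fraction of target capability
    dev_cost = rng.triangular(80, 100, 160, N)   # development cost, $M
    dev_time = rng.triangular(36, 48, 84, N)     # development time, months

    # Program requirements (assumed constraint values).
    p_capability = np.mean(perf >= 0.85)
    p_budget = np.mean(dev_cost <= 120)
    p_schedule = np.mean(dev_time <= 60)

    # Joint robustness metric: probability all requirements are met at once.
    p_all = np.mean((perf >= 0.85) & (dev_cost <= 120) & (dev_time <= 60))
    print(p_capability, p_budget, p_schedule, p_all)

In the full methodology these success probabilities would feed the multi-objective optimization as objectives or constraints; here they simply illustrate how a quantitative robustness measure replaces a qualitative Technology Readiness Level.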
Visual Field Asymmetry in Attentional Capture
ERIC Educational Resources Information Center
Du, Feng; Abrams, Richard A.
2010-01-01
The present study examined the spatial distribution of involuntary attentional capture over the two visual hemi-fields. A new experiment, and an analysis of three previous experiments showed that distractors in the left visual field that matched a sought-for target in color produced a much larger capture effect than identical distractors in the…
Khavjou, Olga A; Honeycutt, Amanda A; Hoerger, Thomas J; Trogdon, Justin G; Cash, Amanda J
2014-08-01
Community-based programs require substantial investments of resources; however, evaluations of these programs usually lack analyses of program costs. Costs of community-based programs reported in the previous literature are limited and have been estimated retrospectively. This study describes a prospective cost data collection approach developed for the Communities Putting Prevention to Work (CPPW) program, capturing costs for community-based tobacco use and obesity prevention strategies. A web-based cost data collection instrument was developed using an activity-based costing approach. Respondents reported quarterly expenditures on labor; consultants; materials, travel, and services; overhead; partner efforts; and in-kind contributions. Costs were allocated across CPPW objectives and strategies organized around five categories: media, access, point of decision/promotion, price, and social support and services. The instrument was developed in 2010, quarterly data collections took place in 2011-2013, and preliminary analysis was conducted in 2013. Preliminary descriptive statistics are presented for the cost data collected from 51 respondents. More than 50% of program costs were for partner organizations, and over 20% of costs were for labor hours. Tobacco communities devoted the majority of their efforts to media strategies. Obesity communities spent more than half of their resources on access strategies. Collecting accurate cost information on health promotion and disease prevention programs presents many challenges. The approach presented in this paper is one of the first efforts to successfully collect these types of data and can be replicated for collecting costs from other programs. Copyright © 2014 American Journal of Preventive Medicine. All rights reserved.
McConnel, M B; Galligan, D T
2004-10-01
Optimization programs are currently used to aid in the selection of bulls for herd breeding programs. While these programs offer a systematic approach to the problem of semen selection, they ignore the impact of volume discounts: discounts that vary with the number of straws purchased. The dynamic nature of volume discounts means that, to be adequately accounted for, they must be considered within the optimization routine. Failing to do this creates a missed economic opportunity, because the potential benefits of optimally selecting and combining breeding company discount opportunities are not captured. To address these issues, an integer program was created that uses binary decision variables to incorporate the effects of quantity discounts into the optimization, as sketched below. A consistent set of trait criteria was used to select a group of bulls from 3 sample breeding companies. Three different selection programs were used to select the bulls: 2 traditional methods and the integer method. After the discounts were applied using each method, the integer program resulted in the lowest-cost portfolio of bulls. A sensitivity analysis showed that the integer program also resulted in a low-cost portfolio when the genetic trait goals were changed to be more or less stringent. In the sample application, the net benefit of the new approach over the traditional approaches was a 12.3 to 20.0% savings in semen cost.
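The sketch below shows, under stated assumptions, how binary decision variables can fold a volume-discount tier into a cost-minimizing integer program using the open-source PuLP library. The bulls, prices, merit scores, and single discount tier are invented; the paper's actual formulation and data are not reproduced here.

    import pulp

    # Made-up data: straw price, supplying company, and merit score per bull.
    price = {"A": 18.0, "B": 25.0, "C": 15.0, "D": 20.0}
    company = {"A": 1, "B": 1, "C": 2, "D": 2}
    merit = {"A": 1.2, "B": 1.8, "C": 0.9, "D": 1.5}
    MATINGS, MIN_MERIT = 120, 1.3          # herd demand and average-merit goal
    THRESH, DISC, BIG_M = 100, 0.10, 1e5   # 10% off a company order of >= 100 straws

    prob = pulp.LpProblem("semen_portfolio", pulp.LpMinimize)
    buy = {b: pulp.LpVariable(f"buy_{b}", lowBound=0, cat="Integer") for b in price}
    z = {c: pulp.LpVariable(f"tier_{c}", cat="Binary") for c in (1, 2)}
    cost = {c: pulp.LpVariable(f"cost_{c}", lowBound=0) for c in (1, 2)}

    prob += pulp.lpSum(cost.values())                  # objective: total semen cost
    prob += pulp.lpSum(buy.values()) == MATINGS        # meet demand
    prob += pulp.lpSum(merit[b] * buy[b] for b in buy) >= MIN_MERIT * MATINGS

    for c in (1, 2):
        vol = pulp.lpSum(buy[b] for b in buy if company[b] == c)
        base = pulp.lpSum(price[b] * buy[b] for b in buy if company[b] == c)
        prob += vol >= THRESH * z[c]        # discount allowed only if volume earned
        prob += cost[c] >= base - BIG_M * z[c]                     # z = 0 branch
        prob += cost[c] >= (1 - DISC) * base - BIG_M * (1 - z[c])  # z = 1 branch

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print({b: int(buy[b].value()) for b in buy}, pulp.value(prob.objective))

The paired big-M constraints are the standard linearization: minimization drives each company's cost variable down to the undiscounted total when its binary is 0 and to the discounted total when it is 1, while the volume constraint keeps the binary honest.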
Family Therapy Training at the Ackerman Institute: Thoughts of Form and Substance.
ERIC Educational Resources Information Center
LaPerriere, Kitty
1979-01-01
Presents the history, philosophy, and form of training at the Ackerman Institute for Family Therapy, and attempts to capture the spirit and atmosphere of the program rather than enumerate details. The program teaches family therapy and a systems perspective on human behavior to professionals who have completed basic professional training. (Author)
Measuring Quality of Delivery in a Substance Use Prevention Program
ERIC Educational Resources Information Center
Giles, Steven; Jackson-Newsom, Julia; Pankratz, Melinda M.; Hansen, William B.; Ringwalt, Christopher L.; Dusenbury, Linda
2008-01-01
The purpose of this study was to develop and validate an observation measure designed to capture teachers' use of interactive teaching skills within the delivery of the All Stars substance use prevention program. Coders counted the number of times teachers praised and encouraged students, accepted and used students' ideas, asked questions,…
Buffering Negative Impacts of Divorce on Children: Evaluating Impact of Divorce Education
ERIC Educational Resources Information Center
Crawford, Jennifer K.; Riffe, Jane; Trevisan, Dominic A.; Adesope, Olusola O.
2014-01-01
Following the call for more stringent evaluation methodology and recently documented national Extension presence in the field of divorce education for parents and children, the study reported here describes a local multi-level evaluation to capture program impact of a stakeholder-accepted divorce education program. Using a post-then-pre…
Noonan, Rita K; Gibbs, Deborah
2009-01-01
This special issue captures several threads in the ongoing evolution of sexual violence prevention. The articles that follow examine an empowerment evaluation process with four promising programs dedicated to preventing first-time male perpetration of sexual violence, as well as evaluation findings. Both the evaluation approach and the programs examined shed light on how sexual violence prevention can continue to be improved in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hawkes, Lynette A.
1991-03-01
The seaward migration of salmonid smolts was monitored by the National Marine Fisheries Service (NMFS) at three sites on the Columbia River system in 1990. This project is a part of the continuing Smolt Monitoring Program to monitor Columbia Basin salmonid stocks, coordinated by the Fish Passage Center (FPC) for the Columbia Basin Fish and Wildlife Agencies and Indian Tribes. Its purpose is to provide timely data to the Fish Passage Managers for in-season flow and spill management for fish passage, and post-season analysis of travel time and of the relative magnitude and timing of the smolt migration. This program is carried out under the auspices of the Northwest Power Planning Council Fish and Wildlife Program and is funded by the Bonneville Power Administration (BPA). Sampling sites were John Day and Bonneville Dams under the Smolt Monitoring Program, and The Dalles Dam under the "Fish Spill Memorandum of Agreement" for 1990. All pertinent fish capture, condition, and brand data, as well as dam operations and river flow data, were reported daily to FPC. These data were incorporated into the FPC Fish Passage Data Information System (FPDIS). 10 refs., 8 figs., 1 tab.
Pandey, Anil Kumar; Saroha, Kartik; Sharma, Param Dev; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
In this study, we developed a simple image processing application in MATLAB that uses suprathreshold stochastic resonance (SSR) to help the user visualize abdominopelvic tumors on exported prediuretic positron emission tomography/computed tomography (PET/CT) images. A brainstorming session was conducted for requirement analysis for the program. It was decided that the program should load the screen-captured PET/CT images and then produce output images in a window with a slider control that enables the user to view the image that best visualizes the tumor, if present. The program was implemented on a personal computer using Microsoft Windows and MATLAB R2013b. The program has an option for the user to select the input image. For the selected image, it displays output images generated using SSR in a separate window having a slider control. The slider control enables the user to view the images and select the one that seems to provide the best visualization of the area(s) of interest. The developed application enables the user to select, process, and view output images in the process of utilizing SSR to detect the presence of abdominopelvic tumor on a prediuretic PET/CT image.
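For readers unfamiliar with SSR, here is a small Python sketch of the underlying operation (the original tool is MATLAB; this is an illustrative reimplementation with assumed parameter values): independent noise is added to the image, each noisy copy is thresholded, and the binary outputs are averaged. The tool's slider maps naturally onto the noise level.

    import numpy as np

    def ssr_enhance(img, sigma, threshold, n_copies=64, rng=None):
        """Average of thresholded noisy copies of a [0, 1] grayscale image."""
        rng = rng or np.random.default_rng(0)
        out = np.zeros_like(img, dtype=float)
        for _ in range(n_copies):
            noisy = img + rng.normal(0.0, sigma, img.shape)
            out += (noisy > threshold).astype(float)
        return out / n_copies

    # Sweep the noise level and let the user pick the most diagnostic output,
    # as the slider control does. The image here is a random stand-in.
    img = np.random.default_rng(1).random((64, 64)) * 0.2
    outputs = [ssr_enhance(img, s, threshold=0.5) for s in np.linspace(0.05, 0.5, 10)]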
PcapDB: Search Optimized Packet Capture, Version 0.1.0.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrell, Paul; Steinfadt, Shannon
PcapDB is a packet capture system designed to optimize the captured data for fast search in the typical (network incident response) use case. The technology involved in this software has been submitted via the IDEAS system and has been filed as a provisional patent. It includes the following primary components: capture: The capture component utilizes existing capture libraries to retrieve packets from network interfaces. Once retrieved, the packets are passed to additional threads for sorting into flows and indexing. The sorted flows and indexes are passed to other threads so that they can be written to disk. These components are written in the C programming language. search: The search components provide a means to find relevant flows and the associated packets. A search query is parsed and represented as a search tree. Various search commands, written in C, are then used to resolve this tree into a set of search results. The tree generation and search execution management components are written in Python. interface: The PcapDB web interface is written in Python on the Django framework. It provides a series of pages, APIs, and asynchronous tasks that allow the user to manage the capture system, perform searches, and retrieve results. Web page components are written in HTML, CSS, and JavaScript.
Leadership Training in Graduate Medical Education: A Systematic Review.
Sadowski, Brett; Cantrell, Sarah; Barelski, Adam; O'Malley, Patrick G; Hartzell, Joshua D
2018-04-01
Leadership is a critical component of physician competence, yet the best approaches for developing leadership skills for physicians in training remain undefined. We systematically reviewed the literature on existing leadership curricula in graduate medical education (GME) to inform leadership program development. Using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, we searched MEDLINE, ERIC, EMBASE, and MedEdPORTAL through October 2015 using search terms to capture GME leadership curricula. Abstracts were reviewed for relevance, and included studies were retrieved for full-text analysis. Article quality was assessed using the Best Evidence in Medical Education (BEME) index. A total of 3413 articles met the search criteria, and 52 were included in the analysis. Article quality was low, with 21% (11 of 52) having a BEME score of 4 or 5. Primary care specialties were the most represented (58%, 30 of 52). The majority of programs were open to all residents (81%, 42 of 52). Projects and the use of mentors or coaches were components of 46% and 48% of curricula, respectively. Only 40% (21 of 52) were longitudinal throughout training. The most frequent pedagogic methods were lectures, small-group activities, and cases. Common topics included teamwork, leadership models, and change management. Evaluation focused on learner satisfaction and self-assessed knowledge. Longitudinal programs were more likely to be successful. GME leadership curricula are heterogeneous and limited in effectiveness. Small-group teaching, project-based learning, mentoring, and coaching were more frequently used in higher-quality studies.
Reyns, Nikolaas; Casaer, Jim; De Smet, Lieven; Devos, Koen; Huysentruyt, Frank; Robertson, Peter A; Verbeke, Tom; Adriaens, Tim
2018-01-01
Sound decisions on control actions for established invasive alien species (IAS) require information on the ecological as well as the socio-economic impact of the species and of its management. Cost-benefit analysis provides part of this information, yet has received relatively little attention in the scientific literature on IAS. We apply a bio-economic model in a cost-benefit analysis framework to the greater Canada goose Branta canadensis, an IAS with documented social, economic and ecological impacts in Flanders (northern Belgium). We compared a business as usual (BAU) scenario, which involved non-coordinated hunting and egg destruction, with an enhanced scenario based on a continuation of these activities supplemented with coordinated capture of moulting birds. To assess population growth under the BAU scenario we fitted a logistic growth model to the observed pre-moult capture population. Projected damage costs included water eutrophication and damage to cultivated grasslands and were calculated for all scenarios. Management costs of the moult captures were based on a representative average of the actual cost of planning and executing moult captures. Comparing the scenarios with different capture rates, different costs for eutrophication and various discount rates showed that avoided damage costs were in the range of 21.15 M€ to 45.82 M€ under the moult capture scenario. The lowest value for the avoided costs applied to the scenario where we lowered the capture rate by 10%. The highest value occurred in the scenario where we lowered the real discount rate from 4% to 2.5%. The reduction in damage costs always outweighed the additional management costs of moult captures. Therefore, additional coordinated moult captures could be applied to limit the negative economic impact of the greater Canada goose at a regional scale. We further discuss the strengths and weaknesses of our approach and its potential application to other IAS.
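A minimal sketch of the population model driving such scenario comparisons: discrete-time logistic growth with an annual proportional moult-capture removal, in Python. The parameter values are placeholders, not the fitted Flanders estimates.

    import numpy as np

    def project(n0, r, K, capture_rate, years):
        """Discrete-time logistic growth with annual moult captures."""
        n, traj = float(n0), [float(n0)]
        for _ in range(years):
            n = n + r * n * (1.0 - n / K)            # logistic increment
            n = max(n * (1.0 - capture_rate), 0.0)   # coordinated moult captures
            traj.append(n)
        return np.array(traj)

    bau = project(n0=5000, r=0.35, K=20000, capture_rate=0.0, years=10)
    enhanced = project(n0=5000, r=0.35, K=20000, capture_rate=0.2, years=10)
    # Avoided damage scales with the goose-years removed by the enhanced scenario.
    print(bau.sum() - enhanced.sum())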
Recent Simulations of the Late Stages Growth of Jupiter
NASA Technical Reports Server (NTRS)
Lissauer, Jack J.; D'Angelo, Gennaro; Hubickyj, Olenka
2012-01-01
Simulations presented by Lissauer et al. (2009, Icarus 199, 338) are used to test the model of capture of Jupiter's irregular satellites within proto-Jupiter's distended and thermally-supported envelope. We find such capture highly unlikely, since the envelope shrinks too slowly for a large number of moons to be retained, and many of those that would be retained would orbit closer to the planet than do the observed Jovian irregulars. Our calculations do not address (and therefore do not exclude) the possibility that the irregular satellites were captured as a result of gas drag within a circumjovian disk. Support for this research from the NASA Outer Planets Research Program is gratefully acknowledged.
Yoon, Junghyo; Yoon, Hee-Sook; Shin, Yoojin; Kim, Sanghyun; Ju, Youngjun; Kim, Jungbae; Chung, Seok
2017-07-01
Electrospun and ethanol-dispersed polystyrene-poly(styrene-co-maleic anhydride) (PS-PSMA) nanofibers (NFs) were used as a platform for the selective capture and three-dimensional culture of EpCAM-positive cells in cell culture medium and whole blood. The NFs were treated with streptavidin to facilitate bond formation between the amino groups of streptavidin and the maleic anhydride groups of the NFs. A biotinylated anti-EpCAM monoclonal antibody (mAb) was attached to the streptavidin-conjugated NFs via the selective binding of streptavidin and biotin. Upon simple mixing and shaking with EpCAM-positive cancer cells over a wide concentration range from 10 to 1,000,000 cells per 10 mL, the mAb-attached NFs (mAb-NFs) captured the EpCAM-positive cells with an efficiency of 59%-67% depending on the initial cell concentration, with minor mechanical capture of 14%-36%. Captured cells were directly cultured, forming cell aggregates, in the NF matrix, which ensures cell proliferation and follow-up analysis. Furthermore, the capture capacity of the mAb-NFs was assessed in the presence of whole blood and blood lysates, indicating cluster formation that captured target cells. It is anticipated that the antibody-attached NFs can be employed for the capture and analysis of very rare EpCAM-positive circulating cancer cells. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Boaggio, K.; Bandamede, M.; Bancroft, L.; Hurler, K.; Magee, N. B.
2016-12-01
We report on details of continuing instrument development and deployment of a novel balloon-borne device for capturing and characterizing atmospheric ice and aerosol particles, the Ice Cryo Encapsulator by Balloon (ICE-Ball). The device is designed to capture and preserve cirrus ice particles, maintaining them at cold equilibrium temperatures, so that high-altitude particles can be recovered, transferred intact, and then imaged under SEM at unprecedented resolution (approximately 3 nm maximum). In addition to cirrus ice particles, high-altitude aerosol particles are also captured, imaged, and analyzed for geometry, chemical composition, and activity as ice-nucleating particles. Prototype versions of ICE-Ball have successfully captured and preserved high-altitude ice particles and aerosols, then returned them for recovery, SEM imaging, and analysis. New improvements include 1) the ability to capture particles from multiple narrowly-defined altitudes on a single payload, 2) high-quality measurements of coincident temperature, humidity, and high-resolution video at capture altitude, 3) the ability to capture particles during both ascent and descent, 4) better characterization of particle collection volume and collection efficiency, and 5) improved isolation and characterization of the capture-cell cryo environment. This presentation provides detailed capability specifications for anyone interested in using the measurements, collaborating on continued instrument development, or including this instrument in ongoing or future field campaigns.
Schlieren Technique Applied to Magnetohydrodynamic Generator Plasma Torch
NASA Astrophysics Data System (ADS)
Chopra, Nirbhav; Pearcy, Jacob; Jaworski, Michael
2017-10-01
Magnetohydrodynamic (MHD) generators are a promising augmentation to current hydrocarbon-based combustion schemes for generating electrical power. In recent years, interest in MHD generators has been revitalized by advances in a number of technologies, such as superconducting magnets, solid-state power electronics, and materials science, as well as by the changing economics associated with carbon capture, utilization, and sequestration. We use a multi-wavelength schlieren imaging system to evaluate electron density independently of gas density in a plasma torch under conditions relevant to MHD generators. The sensitivity and resolution of the optical system are evaluated alongside the development of an automated analysis and calibration program in Python. Preliminary analysis shows spatial resolution of less than 1 mm and measures an electron density of n_e = 1 × 10^16 cm^-3 in an atmospheric microwave torch. Work supported by DOE contract DE-AC02-09CH11466.
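The wavelength separation rests on the standard refractivity decomposition for a partially ionized gas (textbook plasma optics, stated here as background rather than quoted from the presentation):

    \[
      n(\lambda) - 1 \;\approx\; K\rho \;-\; \frac{e^{2}\, n_e\, \lambda^{2}}{8\pi^{2}\varepsilon_{0} m_{e} c^{2}},
    \]

where K is the Gladstone-Dale constant and ρ the neutral-gas density. The neutral term is nearly achromatic while the free-electron term grows as λ², so schlieren deflections measured at two or more wavelengths give independent equations from which the electron-density gradient can be solved separately from the gas-density gradient.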
A Computational Approach for Probabilistic Analysis of Water Impact Simulations
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.
2009-01-01
NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those worked during the Apollo program in the sixties. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to interpolate solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis-of-variance approach used in the sensitivity studies, the equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
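A toy Python sketch of the surrogate idea (the "simulator" below is a stand-in function, not LS-DYNA, and the quadratic basis is one common response-surface choice):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(30, 2))   # 30 sampled designs, e.g. (speed, angle)
    # Stand-in for expensive simulation outputs at those designs:
    y = 3 + 2*X[:, 0] - X[:, 1] + 0.5*X[:, 0]*X[:, 1] + rng.normal(0, 0.05, 30)

    def quad_features(X):
        x1, x2 = X[:, 0], X[:, 1]
        return np.column_stack([np.ones(len(X)), x1, x2, x1*x2, x1**2, x2**2])

    # Fit the quadratic response surface by least squares ...
    coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
    # ... then interpolate at new designs at a fraction of the simulation cost.
    print(quad_features(np.array([[0.3, -0.4]])) @ coef)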
International Research Results and Accomplishments From the International Space Station
NASA Technical Reports Server (NTRS)
Ruttley, Tara M.; Robinson, Julie A.; Tate-Brown, Judy; Perkins, Nekisha; Cohen, Luchino; Marcil, Isabelle; Heppener, Marc; Hatton, Jason; Tasaki, Kazuyuki; Umemura, Sayaka;
2016-01-01
In 2016, the International Space Station (ISS) partnership published the first-ever compilation of international ISS research publications resulting from research performed on the ISS through 2011. The International Space Station Research Accomplishments: An Analysis of Results From 2000-2011 is a collection of summaries of over 1,200 journal publications that describe ISS research in the areas of biology and biotechnology; Earth and space science; educational activities and outreach; human research; physical sciences; technology development and demonstration; and results from ISS operations. This paper summarizes the ISS results publications obtained through 2011 on behalf of the ISS Program Science Forum, which is made up of senior science representatives across the international partnership. NASA's ISS Program Science office maintains an online experiment database (www.nasa.gov/issscience) that tracks and communicates ISS research activities across the entire ISS partnership and is continuously updated. It captures ISS experiment summaries and results and includes citations to the journals, conference proceedings, and patents as they become available. The International Space Station Research Accomplishments: An Analysis of Results From 2000-2011 is a testament to the research that was underway even as the ISS laboratory was being built. It reflects the scientific knowledge gained from ISS research and how that knowledge impacts fields of science both in space and in traditional science disciplines on Earth. Now, at a time when utilization is at its busiest, and with the extension of the ISS through at least 2024, the ISS partners work together to track the accomplishments and the new knowledge gained in a way that will impact humanity like no laboratory on Earth. The ISS Program Science Forum will continue to capture and report on these results in the form of journal publications, conference proceedings, and patents. We anticipate that successful ISS research will continue to contribute to the scientific literature in a way that helps to formulate new hypotheses and conclusions, enabling science advancements across a wide range of disciplines both in space and on Earth.
Motion capture based identification of the human body inertial parameters.
Venture, Gentiane; Ayusawa, Ko; Nakamura, Yoshihiko
2008-01-01
Identification of body inertias, masses, and centers of mass provides important data for simulating, monitoring, and understanding the dynamics of motion, and for personalizing rehabilitation programs. This paper proposes an original method to identify the inertial parameters of the human body, making use of motion capture data and contact force measurements. It allows painless in-vivo estimation and monitoring of the inertial parameters. The method is described, and the experimental results obtained are then presented and discussed.
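The key property such methods exploit is that rigid-body dynamics are linear in the inertial parameters, so stacking a regressor over many motion-capture frames yields an ordinary least-squares problem. The one-link planar sketch below (Python, invented numbers) shows the structure; a whole-body regressor is far larger but identical in form.

    import numpy as np

    g = 9.81

    def regressor(q, dq, ddq):
        """Planar one-link regressor: tau = [ddq, g*cos(q)] @ [I, m*c]."""
        return np.array([ddq, g * np.cos(q)])

    rng = np.random.default_rng(2)
    true_params = np.array([0.8, 1.5])   # [link inertia, mass * CoM distance]
    q = rng.uniform(-1, 1, 500)
    dq = rng.normal(0, 1, 500)
    ddq = rng.normal(0, 2, 500)

    A = np.stack([regressor(*s) for s in zip(q, dq, ddq)])
    tau = A @ true_params + rng.normal(0, 0.01, 500)   # "measured" torques

    est, *_ = np.linalg.lstsq(A, tau, rcond=None)
    print(est)   # recovers the inertial parameters from motion + force data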
The Quest CCS Project - MMV Technology Deployment Through Two Years of Operation
NASA Astrophysics Data System (ADS)
O'Brien, S.
2017-12-01
In September 2012, Shell, on behalf of the Athabasca Oil Sands Project venture (Shell Canada Energy, Chevron Canada Limited, Marathon Oil Canada Corporation), announced that it was proceeding to construct the Quest Carbon Capture and Storage (CCS) project near Fort Saskatchewan. Quest is the world's first large-scale commercial application of CCS at an oil sands operation, and it is now capturing more than one million tonnes of CO2 per year from the Scotford Upgrader. It is a fully integrated project, involving CO2 capture at the bitumen upgrader, transportation along a 65 km pipeline, and CO2 storage in a deep saline aquifer (the Basal Cambrian Sands). Construction was completed in August 2015, and the Quest project was certified for commercial operation in September 2015. The Measurement, Monitoring and Verification (MMV) program for Quest is comprehensive, with a variety of technologies being used to monitor the atmosphere, hydrosphere, biosphere and geosphere. These include a Lightsource system for atmospheric monitoring, extensive groundwater sampling, DAS VSPs to assess the development of the CO2 plume, a microseismic array to measure any induced seismic activity, and temperature and pressure gauges for reservoir monitoring. Over two years of operations, this program has been optimized to address key risks while improving operational efficiency. Quest has now successfully captured and stored more than 2 million tonnes of CO2 with no MMV indications of any storage issues.
Stability of a slotted ALOHA system with capture effect
NASA Astrophysics Data System (ADS)
Onozato, Yoshikuni; Liu, Jin; Noguchi, Shoichi
1989-02-01
The stability of a slotted ALOHA system with capture effect is investigated under a general communication environment where terminals are divided into two groups (low-power and high-power) and the capture effect is modeled by capture probabilities. An approximate analysis is developed using catastrophe theory, in which the effects of system and user parameters on the stability are characterized by the cusp catastrophe. Particular attention is given to the low-power group, since it must bear the strain under the capture effect. The stability conditions of the two groups are given explicitly by bifurcation sets.
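A toy Python simulation of the system analyzed above, with two power groups and a fixed capture probability standing in for the paper's more general capture model; all parameter values are illustrative. It reproduces the qualitative point that the low-power group bears the strain:

    import numpy as np

    rng = np.random.default_rng(3)
    N_LO, N_HI, P_TX, P_CAPTURE, SLOTS = 30, 10, 0.05, 0.6, 200_000

    success_lo = success_hi = 0
    for _ in range(SLOTS):
        lo = rng.binomial(N_LO, P_TX)   # low-power transmissions this slot
        hi = rng.binomial(N_HI, P_TX)   # high-power transmissions this slot
        if lo + hi == 1:                # a clean slot always succeeds
            success_lo += int(lo == 1)
            success_hi += int(hi == 1)
        elif hi == 1 and lo >= 1 and rng.random() < P_CAPTURE:
            success_hi += 1             # lone high-power packet captures the slot

    print("low-power throughput: ", success_lo / SLOTS)
    print("high-power throughput:", success_hi / SLOTS)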
Orbital express capture system: concept to reality
NASA Astrophysics Data System (ADS)
Stamm, Shane; Motaghedi, Pejmun
2004-08-01
The development of autonomous servicing of on-orbit spacecraft has been a sought-after objective for many years. A critical component of on-orbit servicing involves the ability to successfully capture, mate, and perform electrical and fluid transfers autonomously. As part of a Small Business Innovation Research (SBIR) grant, Starsys Research Corporation (SRC) began developing such a system. Phase I of the grant started in 1999, with initial work focusing on simultaneously defining the parameters associated with successful docking while designing to those parameters. Despite the challenge of working without specific requirements, SRC completed development of a prototype design in 2000. Throughout the following year, testing was conducted on the prototype to characterize its performance. Having successfully completed work on the prototype, SRC began a Phase II SBIR effort in mid-2001. The focus of the second phase was a commercialization effort designed to mature the prototype into a more flight-like design. The technical requirements, however, still needed clear definition for the design to progress. The advent of the Orbital Express (OE) program provided much of that definition. While still in the proposal stages of the OE program, SRC began tailoring the prototype redesign efforts to the OE program requirements. A primary challenge involved striking a balance between addressing the technical requirements of OE and designing within the scope of the SBIR. Upon award of the OE contract, the Phase II SBIR design was fully developed. This new design, designated the Mechanical Docking System (MDS), successfully incorporated many of the requirements of the OE program. SRC is now completing dynamic testing on the MDS hardware, with a parallel effort to develop a flight design for OE. As testing on the MDS progresses, the design path that was once common to both the SBIR effort and the OE program begins to diverge. The MDS will complete the scope of the Phase II SBIR work, while the new mechanism, the Orbital Express Capture System, will emerge as a flight-qualified design for the Orbital Express program.
Clark, S.G.; Rutherford, M.B.; Auer, M.R.; Cherney, D.N.; Wallace, R.L.; Mattson, D.J.; Clark, D.A.; Foote, L.; Krogman, N.; Wilshusen, P.; Steelman, T.
2011-01-01
Environmental studies and environmental sciences programs in American and Canadian colleges and universities seek to ameliorate environmental problems through empirical enquiry and analytic judgment. In a companion article (Part 1) we describe the environmental program movement (EPM) and discuss factors that have hindered its performance. Here, we complete our analysis by proposing strategies for improvement. We recommend that environmental programs re-organize around three principles. First, adopt as an overriding goal the concept of human dignity-defined as freedom and social justice in healthy, sustainable environments. This clear higher-order goal captures the human and environmental aspirations of the EPM and would provide a more coherent direction for the efforts of diverse participants. Second, employ an explicit, genuinely interdisciplinary analytical framework that facilitates the use of multiple methods to investigate and address environmental and social problems in context. Third, develop educational programs and applied experiences that provide students with the technical knowledge, powers of observation, critical thinking skills and management acumen required for them to become effective professionals and leaders. Organizing around these three principles would build unity in the EPM while at the same time capitalizing on the strengths of the many disciplines and diverse local conditions involved. © 2011 Springer Science+Business Media, LLC.
Mendes-de-Almeida, Flavya; Remy, Gabriella L; Gershony, Liza C; Rodrigues, Daniela P; Chame, Marcia; Labarthe, Norma V
2011-06-01
The size of urban cat colonies is limited only by the availability of food and shelter; therefore, their population growth challenges all known population control programs. To test a new population control method, a free-roaming feral cat colony at the Zoological Park in the city of Rio de Janeiro was studied, beginning in 2001. The novel method consisted of performing a hysterectomy on all captured female cats over 6 months of age. To estimate the size of the colony and compare population from year to year, a method of capture-mark-release-recapture was used. The aim was to capture as many individuals as possible, including cats of all ages and gender to estimate numbers of cats in all population categories. Results indicated that the feral cat population remained constant from 2001 to 2004. From 2004 to 2008, the hysterectomy program and population estimates were performed every other year (2006 and 2008). The population was estimated to be 40 cats in 2004, 26 in 2006, and 17 cats in 2008. Although pathogens tend to infect more individuals as the population grows older and maintains natural behavior, these results show that free-roaming feral cat colonies could have their population controlled by a biannual program that focuses on hysterectomy of sexually active female cats. Copyright © 2011 ISFM and AAFP. Published by Elsevier Ltd. All rights reserved.
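For reference, the classic closed-population estimate behind capture-mark-release-recapture surveys of this kind is the Chapman-corrected Lincoln-Petersen estimator; a small Python sketch with invented counts, not the Rio de Janeiro data:

    def chapman_estimate(marked_first, caught_second, marked_recaptured):
        """Chapman's bias-corrected Lincoln-Petersen population estimate."""
        return ((marked_first + 1) * (caught_second + 1)
                / (marked_recaptured + 1)) - 1

    # e.g., 15 cats marked, 18 caught later, of which 11 carried marks:
    print(round(chapman_estimate(15, 18, 11)))   # estimated colony size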
Two-step rapid sulfur capture. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-04-01
The primary goal of this program was to test the technical and economic feasibility of a novel dry sorbent injection process, called the Two-Step Rapid Sulfur Capture process, for several advanced coal utilization systems. The Two-Step Rapid Sulfur Capture process consists of limestone activation in a high-temperature auxiliary burner for short times, followed by sorbent quenching in a lower-temperature, sulfur-containing coal combustion gas. The Two-Step Rapid Sulfur Capture process is based on the Non-Equilibrium Sulfur Capture process developed by the Energy Technology Office of Textron Defense Systems (ETO/TDS). Based on the Non-Equilibrium Sulfur Capture studies, the range of conditions for optimum sorbent activation was thought to be activation temperatures > 2,200 K for activation times in the range of 10-30 ms. Therefore, the aim of the Two-Step process is to create a very active sorbent (under conditions similar to the bomb reactor) and complete the sulfur reaction under thermodynamically favorable conditions. A flow facility was designed and assembled to simulate the temperature, time, stoichiometry, and sulfur gas concentration prevalent in advanced coal utilization systems such as gasifiers, fluidized bed combustors, mixed-metal oxide desulfurization systems, diesel engines, and gas turbines.
Inferring rules of lineage commitment in haematopoiesis.
Pina, Cristina; Fugazza, Cristina; Tipping, Alex J; Brown, John; Soneji, Shamit; Teles, Jose; Peterson, Carsten; Enver, Tariq
2012-02-19
How the molecular programs of differentiated cells develop as cells transit from multipotency through lineage commitment remains unexplored. This reflects the inability to access cells undergoing commitment or located in the immediate vicinity of commitment boundaries. It remains unclear whether commitment constitutes a gradual process, or else represents a discrete transition. Analyses of in vitro self-renewing multipotent systems have revealed cellular heterogeneity with individual cells transiently exhibiting distinct biases for lineage commitment. Such systems can be used to molecularly interrogate early stages of lineage affiliation and infer rules of lineage commitment. In haematopoiesis, population-based studies have indicated that lineage choice is governed by global transcriptional noise, with self-renewing multipotent cells reversibly activating transcriptome-wide lineage-affiliated programs. We examine this hypothesis through functional and molecular analysis of individual blood cells captured from self-renewal cultures, during cytokine-driven differentiation and from primary stem and progenitor bone marrow compartments. We show dissociation between self-renewal potential and transcriptome-wide activation of lineage programs, and instead suggest that multipotent cells experience independent activation of individual regulators resulting in a low probability of transition to the committed state.
Status of the Neutron Capture Measurement on 237Np with the DANCE Array at LANSCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Esch, E.-I.; Bond, E.M.; Bredeweg, T. A.
2005-05-24
Neptunium-237 is a major constituent of spent nuclear fuel. Estimates place the amount of 237Np bound for the Yucca Mountain high-level waste repository at 40 metric tons. The Department of Energy's Advanced Fuel Cycle Initiative program is evaluating methods for transmuting the actinide waste that will be generated by future operation of commercial nuclear power plants. The critical parameter that defines the transmutation efficiency of actinide isotopes is the neutron fission-to-capture ratio for the particular isotope in a given neutron spectrum. The calculation of transmutation efficiency therefore requires accurate fission and capture cross sections. Current 237Np evaluations available for transmuter system studies show significant discrepancies in both the fission and capture cross sections in the energy regions of interest. Herein we report on 237Np(n,γ) measurements using the recently commissioned DANCE array.
Self-Assembly, Guest Capture, and NMR Spectroscopy of a Metal-Organic Cage in Water
ERIC Educational Resources Information Center
Go, Eun Bin; Srisuknimit, Veerasak; Cheng, Stephanie L.; Vosburg, David A.
2016-01-01
A green organic-inorganic laboratory experiment has been developed in which students prepare a self-assembling iron cage in D₂O at room temperature. The tetrahedral cage captures a small, neutral molecule such as cyclohexane or tetrahydrofuran. ¹H NMR analysis distinguishes captured and free guests through diagnostic…
Nenoff, Tina M.; Rodriguez, Mark A.; Soelberg, Nick R.; ...
2014-05-09
The selective capture of radiological iodine (129I) is a persistent concern for safe nuclear energy. In nuclear fuel reprocessing scenarios, the gas streams to be treated are extremely complex, containing several distinct iodine-containing molecules among a large variety of other species. Silver-containing mordenite (MOR) is a longstanding benchmark for radioiodine capture, reacting with molecular iodine (I2) to form AgI. However, the mechanism of organoiodine capture is not well understood. Here we investigate the capture of methyl iodide from complex mixed gas streams by combining chemical analysis of the effluent gas stream with in-depth characterization of the recovered sorbent. Tools applied include infrared spectroscopy, thermogravimetric analysis with mass spectrometry, micro X-ray fluorescence, powder X-ray diffraction analysis, and pair distribution function analysis. Moreover, the MOR zeolite catalyzes decomposition of the methyl iodide through formation of surface methoxy species (SMS), which subsequently react with water in the mixed gas stream to form methanol, and with methanol to form dimethyl ether, both of which are detected downstream in the effluent. The liberated iodine reacts with Ag in the MOR pore to form subnanometer AgI clusters, smaller than the MOR pores, suggesting that the iodine is both physically and chemically confined within the zeolite.
NASA Astrophysics Data System (ADS)
Morse, Brad S.; Pohll, Greg; Huntington, Justin; Rodriguez Castillo, Ramiro
2003-06-01
In 1992, Mexican researchers discovered concentrations of arsenic in excess of World Health Organization (WHO) standards in several municipal wells in the Zimapan Valley of Mexico. This study describes a method to delineate a capture zone for one of the most highly contaminated wells to aid in future well siting. A stochastic approach was used to model the capture zone because of the high level of uncertainty in several input parameters. Two stochastic techniques were performed and compared: "standard" Monte Carlo analysis and the generalized likelihood uncertainty estimator (GLUE) methodology. The GLUE procedure differs from standard Monte Carlo analysis in that it incorporates a goodness-of-fit measure (termed a likelihood measure) in evaluating the model. This allows more information (in this case, head data) to be used in the uncertainty analysis, resulting in smaller prediction uncertainty. Two likelihood measures are tested in this study to determine which is in better agreement with the observed heads. While the standard Monte Carlo approach does not aid in parameter estimation, the GLUE methodology indicates best-fit models when hydraulic conductivity is approximately 10^-6.5 m/s, with vertically isotropic conditions and large quantities of interbasin flow entering the basin. Probabilistic isochrones (capture zone boundaries) are then presented; as predicted, the GLUE-derived capture zones are significantly smaller in area than those from the standard Monte Carlo approach.
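A minimal Python sketch of the step that distinguishes GLUE from plain Monte Carlo: each sampled parameter set is weighted by a likelihood measure built from its fit to observed heads, and the prediction (here a capture-zone area) is summarized with those weights. The forward model, likelihood form, and all numbers below are illustrative stand-ins, not this study's groundwater model.

    import numpy as np

    rng = np.random.default_rng(4)
    n_sets = 5000
    log10_K = rng.uniform(-8, -5, n_sets)   # sampled hydraulic conductivity

    # Stand-ins for the forward model: head misfit and capture-zone area as
    # functions of the sampled parameter (a real run would use a flow model).
    head_rmse = np.abs(log10_K + 6.5) + np.abs(rng.normal(0, 0.1, n_sets))
    area = 10.0 ** (log10_K + 8)            # fake predicted area

    # Likelihood measure: inverse-variance style, zeroed beyond a cutoff.
    L = 1.0 / np.maximum(head_rmse, 1e-6) ** 2
    L[head_rmse > 1.0] = 0.0                # reject non-behavioural sets
    w = L / L.sum()

    # GLUE-weighted 5-95% prediction bounds for the capture-zone area.
    order = np.argsort(area)
    cdf = np.cumsum(w[order])
    lo, hi = area[order][np.searchsorted(cdf, [0.05, 0.95])]
    print(lo, hi)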
Járvás, Gábor; Varga, Tamás; Szigeti, Márton; Hajba, László; Fürjes, Péter; Rajta, István; Guttman, András
2018-02-01
As a continuation of our previously published work, this paper presents a detailed evaluation of a microfabricated cell capture device utilizing a doubly tilted micropillar array. The device was fabricated using a novel hybrid technology based on the combination of proton beam writing and conventional lithography techniques. Tilted pillars offer unique flow characteristics and support enhanced fluidic interaction for improved immunoaffinity-based cell capture. The performance of the microdevice was evaluated by an in-house developed single-cell tracking system based on image sequence analysis. Individual cell tracking allowed in-depth analysis of the cell-chip surface interaction mechanism from a hydrodynamic point of view. Simulation results were validated using the hybrid device and the optimized surface functionalization procedure. Finally, the cell capture capability of this new-generation microdevice was demonstrated by efficiently arresting cells from an HT29 cell-line suspension. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Pre-Combustion Carbon Capture by a Nanoporous, Superhydrophobic Membrane Contactor Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Howard; Zhou, S James; Ding, Yong
2012-03-31
This report summarizes progress made during Phase I and Phase II of the project "Pre-Combustion Carbon Capture by a Nanoporous, Superhydrophobic Membrane Contactor Process," under contract DE-FE-0000646. The objective of this project is to develop a practical and cost-effective technology for CO2 separation and capture in pre-combustion coal-based gasification plants using a membrane contactor/solvent absorption process. The goal of this technology development project is to separate and capture at least 90% of the CO2 from Integrated Gasification Combined Cycle (IGCC) power plants with less than a 10% increase in the cost of energy services. Unlike conventional gas separation membranes, the membrane contactor is a novel gas separation process based on the gas/liquid membrane concept. The membrane contactor is an advanced mass transfer device that operates with liquid on one side of the membrane and gas on the other. The membrane contactor can operate with pressures that are almost the same on both sides of the membrane, whereas gas separation membranes use the differential pressure across the membrane as the driving force for separation. The driving force for separation in the membrane contactor process is the chemical potential difference of CO2 in the gas phase and in the absorption liquid. This process is thus easily tailored to suit the needs of pre-combustion separation and capture of CO2. Gas Technology Institute (GTI) and PoroGen Corporation (PGC) have developed a novel hollow fiber membrane technology based on the chemically and thermally resistant commercial engineered polymer poly(ether ether ketone), or PEEK. The PEEK membrane material used in the membrane contactor during this technology development program is a high-temperature engineered plastic that is virtually non-destructible under the operating conditions encountered in typical gas absorption applications. It can withstand contact with most of the common treating solvents. GTI and PGC have developed a nanoporous and superhydrophobic PEEK-based hollow fiber membrane contactor tailored for the membrane contactor/solvent absorption application in syngas cleanup. The membrane contactor modules were scaled up to 8-inch diameter commercial size modules. We performed extensive laboratory and bench testing using pure gases, a simulated water-gas-shifted (WGS) syngas stream, and a slipstream of gasification-derived syngas from GTI's Flex-Fuel Test Facility (FFTF) gasification plant under commercially relevant conditions. The team also carried out an engineering and economic analysis of the membrane contactor process to evaluate the economics of this technology and its commercial potential. Our test results have shown that 90% CO2 capture can be achieved with several physical solvents such as water and chilled methanol. The rate of CO2 removal by the membrane contactor is in the range of 1.5 to 2.0 kg/m²/hr, depending on the operating pressures and temperatures and on the solvents used. The final economic analysis has shown that the membrane contactor process will cause the cost of electricity to increase by 21% from the base plant without CO2 capture. The goal of a 10% increase in levelized cost of electricity (LCOE) from the base DOE Case 1 (base plant without capture) is not achieved by the membrane contactor. However, the 21% increase in LCOE is a substantial improvement compared with the 31.6% increase in LCOE of DOE Case 2 (state-of-the-art capture technology using two stages of Selexol™).
Cognitive Task Analysis for Instruction in Single-Injection Ultrasound Guided-Regional Anesthesia
ERIC Educational Resources Information Center
Gucev, Gligor V.
2012-01-01
Cognitive task analysis (CTA) is methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound guided…
Ratchford, Andria M; Hamman, Richard F; Regensteiner, Judith G; Magid, David J; Gallagher, Stacy Brennan; Merenich, John A
2004-01-01
Poor rates of participation in cardiac rehabilitation programs are well documented, especially among women and older patients. The Colorado Kaiser Permanente Cardiac Rehabilitation (KPCR) program is a home-based, case-managed, goal-oriented program with an active recruitment process and unlimited program length. This study evaluated the participation rates for the program and the predictors of attendance and graduation. Patients hospitalized with acute myocardial infarction, coronary artery bypass graft, and percutaneous coronary intervention from June 1999 to May 2000 (n = 1030) were identified from the administrative database, and the proportion captured by the KPCR staff was determined. Subsequent attendance and graduation patterns were evaluated. Nearly 94% of patients with one of the three aforementioned conditions were identified by the rehabilitation staff, and 41% of all patients attended the KPCR program. More than 75% of the patients who participated went on to graduate from the program. Gender comparisons showed no difference in participation between men (66.8%) and women (59.7%) (P =.07). Participation rates were inversely associated with age, yet age was not associated with graduation from the program. Surgical interventions and two or more events experienced within the first 4 weeks of the index event were the strongest predictors of attendance and graduation from the KPCR program. Innovative approaches for the capture and retention of patients in cardiac rehabilitation programs are urgently needed. The alternative program evaluated in this study showed little difference in participation between men and women, yet participation among older patients remained poor. Overall, patients who underwent surgical interventions or multiple events were more likely to attend and graduate from the program.
Link, William A.; Barker, Richard J.
2005-01-01
We present a hierarchical extension of the Cormack–Jolly–Seber (CJS) model for open population capture–recapture data. In addition to recaptures of marked animals, we model first captures of animals and losses on capture. The parameter set includes capture probabilities, survival rates, and birth rates. The survival rates and birth rates are treated as a random sample from a bivariate distribution, thus the model explicitly incorporates correlation in these demographic rates. A key feature of the model is that the likelihood function, which includes a CJS model factor, is expressed entirely in terms of identifiable parameters; losses on capture can be factored out of the model. Since the computational complexity of classical likelihood methods is prohibitive, we use Markov chain Monte Carlo in a Bayesian analysis. We describe an efficient candidate-generation scheme for Metropolis–Hastings sampling of CJS models and extensions. The procedure is illustrated using mark-recapture data for the moth Gonodontis bidentata.
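The paper's candidate-generation scheme is specific to CJS likelihoods, but the Metropolis–Hastings mechanics it builds on can be sketched in a few lines. The toy sampler below targets the posterior of a single survival probability under binomial data with a flat prior; the data values and random-walk proposal scale are illustrative assumptions, not the authors' model or scheme.

    import math, random

    # Toy data: of n marked animals released, y survive to the next occasion.
    n, y = 100, 62

    def log_post(phi):
        """Binomial log-likelihood plus a flat prior on (0, 1)."""
        if not 0.0 < phi < 1.0:
            return float("-inf")
        return y * math.log(phi) + (n - y) * math.log(1.0 - phi)

    phi, samples = 0.5, []
    for _ in range(20000):
        prop = phi + random.gauss(0.0, 0.05)      # symmetric random-walk proposal
        if math.log(random.random()) < log_post(prop) - log_post(phi):
            phi = prop                            # accept; otherwise keep current
        samples.append(phi)

    burned = samples[5000:]                       # discard burn-in
    print("posterior mean survival:", sum(burned) / len(burned))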
NASA Technical Reports Server (NTRS)
Knauber, R. N.
1982-01-01
A FORTRAN-coded computer program which computes the capture transient of a launch vehicle upper stage at the ignition and/or separation event is presented. It is for a single degree-of-freedom on-off reaction jet attitude control system. The Monte Carlo method is used to determine the statistical value of key parameters at the outcome of the event. Aerodynamic and booster-induced disturbances, vehicle and control system characteristics, and initial conditions are treated as random variables. By appropriate selection of input data, pitch, yaw, and roll axes can be analyzed. Transient response of a single deterministic case can be computed. The program is currently set up on a CDC CYBER 175 computer system but is compatible with ANSI FORTRAN computer language. This routine has been used over the past fifteen (15) years for the SCOUT Launch Vehicle and has been run on RECOMP III, IBM 7090, IBM 360/370, CDC6600 and CDC CYBER 175 computers with little modification.
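For readers unfamiliar with this class of simulation, the sketch below is a minimal single-axis analogue: an ideal on-off reaction jet with a switching-line deadband, integrated through the separation transient for randomly drawn tip-off conditions. All vehicle constants and dispersions are invented for illustration and are not SCOUT values.

    import random

    def capture_transient(theta0, omega0, dt=0.01, t_end=20.0,
                          inertia=500.0, torque=50.0, deadband=0.01):
        """Single-DOF on-off reaction-jet control; returns peak attitude error (rad)."""
        theta, omega, peak, t = theta0, omega0, abs(theta0), 0.0
        while t < t_end:
            # Fire against the sign of the switching function s = theta + k*omega.
            s = theta + 1.0 * omega
            u = -torque if s > deadband else (torque if s < -deadband else 0.0)
            omega += (u / inertia) * dt
            theta += omega * dt
            peak = max(peak, abs(theta))
            t += dt
        return peak

    # Monte Carlo over random separation tip-off conditions (illustrative values).
    peaks = sorted(capture_transient(random.gauss(0.0, 0.05), random.gauss(0.0, 0.02))
                   for _ in range(1000))
    print("99th-percentile peak attitude error (rad):", peaks[int(0.99 * len(peaks))])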
1989-02-01
which capture the knowledge of such experts. These Expert Systems, or Knowledge-Based Systems, differ from the usual computer programming techniques…their applications in the fields of structural design and welding are reviewed. Expert Systems, or KBES, are computer programs using AI…not procedurally constructed as conventional computer programs usually are; the knowledge base of such systems is executable, unlike databases
Auspitz, Mark; Cleghorn, Michelle C; Tse, Alvina; Sockalingam, Sanjeev; Quereshy, Fayez A; Okrainec, Allan; Jackson, Timothy D
2015-01-01
Review of surgical complications in traditional morbidity and mortality (M&M) rounds remains an important mechanism to identify and discuss quality-of-care issues. This process relies on case selection by providers; therefore, complications identified for review may differ from those captured in comprehensive quality programs such as the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP). Additionally, although the ACS NSQIP captures robust data on complications in surgical wards, without strategies to disseminate this information to staff and improve practice, minimal change may result. The objective of this study was to compare complications identified by the ACS NSQIP with those captured in M&M conferences at a large Canadian academic hospital. Medical records of all patients admitted to the general surgery unit from March 2012 to March 2013 were retrospectively reviewed. Number and types of complications were recorded for cases that were both submitted and reviewed in M&M rounds and those cases that were submitted but not reviewed. These complications were compared with those extracted from our local ACS NSQIP database. A total of 1348 general surgical procedures were performed. The ACS NSQIP captured complications in 143 patients compared with 58 patients identified for review in M&M rounds. Both the methods identified similar proportions of major and minor complications (ACS NSQIP 52% major, 48% minor; M&M 58% major, 42% minor). More postoperative deaths were entered into the ACS NSQIP (12) than in M&M conferences (8 reviewed and 2 submitted). The ACS NSQIP identified higher proportions of surgical site infections and readmissions. However, M&M conferences captured additional complications in patients who did not undergo surgery and identified potential quality issues in patients who did not ultimately experience an adverse outcome. M&M rounds and the ACS NSQIP provide important and potentially complementary data on surgical quality. Incorporating the ACS NSQIP outcomes data into traditional M&M conferences may help to optimize quality improvement efforts. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Poetic Expressions: Students of Color Express Resiliency through Metaphors and Similes
ERIC Educational Resources Information Center
Hall, Horace R.
2007-01-01
The after-school City School Outreach youth program captured the attention of high school male students by offering them a physically and psychologically safe environment to talk about issues they faced. The students of color who attended the program used various forms of creative written expression (i.e., poetry, spoken word, and hip hop) to…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-19
... designed and intended, we committed to a full and final evaluation of the program. We have now completed... its sea otter research project and would no longer be able to assist if we resumed capturing sea.... In light of our inability to implement the translocation program as designed and intended, we...
ERIC Educational Resources Information Center
Van Offelen, Sara; Sherman, Shelley; May, Jill; Rhodes, Felisha
2011-01-01
A focus group of Somali immigrants was conducted as part of a larger study of underserved communities in Minnesota. The goal was to capture Somali women's personal experiences and views on nutrition. This understanding assists Health and Nutrition educators in assessing the quality and effectiveness of current programming efforts and making…
(Almost) Word for Word: As Voice Recognition Programs Improve, Students Reap the Benefits
ERIC Educational Resources Information Center
Smith, Mark
2006-01-01
Voice recognition software is hardly new--attempts at capturing spoken words and turning them into written text have been available to consumers for about two decades. But what was once an expensive and highly unreliable tool has made great strides in recent years, perhaps most recognized in programs such as Nuance's Dragon NaturallySpeaking…
ERIC Educational Resources Information Center
Huang, Denise; Cho, Jamie; Nam, Hannah H.; La Torre, Deborah; Oh, Christine; Harven, Aletha; Huber, Lindsay Perez; Rudo, Zena; Caverly, Sarah
2010-01-01
This study describes how staff qualifications, decisions on staffing procedures, and professional development opportunities support the recruitment and retention of quality staff members. Four high-functioning programs were identified. Qualitative procedures and instruments were designed to capture staff and parents' academic perspectives about…
WhAEM2000 is a computer program that solves steady-state ground-water flow and advective streamlines in homogeneous, single-layer aquifers. The program was designed for capture zone delineation in support of protection of the source water area surrounding public water supply well...
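WhAEM2000's analytic element solution handles general aquifer settings, but the intuition behind capture zone delineation is conveyed by the textbook relations for a single well in uniform ambient flow: the stagnation point lies Q/(2*pi*b*q) downgradient of the well and the capture zone approaches a width of Q/(b*q) far upgradient. The sketch below evaluates those standard formulas with made-up aquifer numbers; it is not WhAEM2000's algorithm.

    import math

    def capture_zone(Q, b, q):
        """Single well pumping Q [m3/d] in uniform flow.
        b: aquifer thickness [m]; q: ambient specific discharge [m/d].
        Returns (stagnation-point distance, asymptotic capture width) in metres."""
        x_stag = Q / (2.0 * math.pi * b * q)
        width = Q / (b * q)
        return x_stag, width

    x_s, w = capture_zone(Q=1000.0, b=20.0, q=0.05)  # illustrative aquifer values
    print(f"stagnation point {x_s:.1f} m downgradient, capture width {w:.1f} m")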
ERIC Educational Resources Information Center
Mitchell, Alison; Baron, Lauren; Macaruso, Paul
2018-01-01
Screening and monitoring student reading progress can be costly and time consuming. Assessment embedded within the context of online instructional programs can capture ongoing student performance data while limiting testing time outside of instruction. This paper presents two studies that examined the validity of using performance measures from a…
Perceptions of L1 Glossed Feedback in Automated Writing Evaluation: A Case Study
ERIC Educational Resources Information Center
Wilken, Jayme Lynn
2018-01-01
Learner perceptions toward and utilization of L1 glossed feedback in an automated writing evaluation (AWE) program were investigated in an Intensive English Program (IEP) class. This small case study focused on two Chinese students who responded to weekly surveys, semi-structured interviews, and screen capture videos of their revisions over a…
Tissue culture of conifer seedlings-20 years on: Viewed through the lens of seedling quality
Steven C. Grossnickle
2011-01-01
Operational vegetative propagation systems provide a means of bringing new genetic material into forestry programs through the capture of a greater proportion of the genetic gain inherent within a selected tree species. Vegetative propagation systems also provide a method for multiplying superior varieties and/or families identified in tree improvement programs. Twenty...
Obstacle Characterization in a Geocrowdsourced Accessibility System
NASA Astrophysics Data System (ADS)
Qin, H.; Aburizaiza, A. O.; Rice, R. M.; Paez, F.; Rice, M. T.
2015-08-01
Transitory obstacles - random, short-lived and unpredictable objects - are difficult to capture in any traditional mapping system, yet they have significant negative impacts on the accessibility of mobility- and visually-impaired individuals. These transitory obstacles include sidewalk obstructions, construction detours, and poor surface conditions. To identify these obstacles and assist the navigation of mobility- and visually-impaired individuals, crowdsourced mapping applications have been developed to harvest and analyze volunteered obstacle reports from local students, faculty, staff, and residents. In this paper, we introduce a training program designed and implemented for recruiting and motivating contributors to participate in our geocrowdsourced accessibility system, and explore the quality of geocrowdsourced data with a comparative analysis methodology.
System design optimization for stand-alone photovoltaic systems sizing by using superstructure model
NASA Astrophysics Data System (ADS)
Azau, M. A. M.; Jaafar, S.; Samsudin, K.
2013-06-01
Although photovoltaic (PV) systems have been increasingly installed as an alternative, renewable green power generation technology, the initial setup cost, maintenance cost, and equipment mismatch are some of the key issues that slow down installation in small households. This paper presents the design optimization of stand-alone photovoltaic systems using a superstructure model in which all possible equipment technology types are captured and life cycle cost analysis is formulated as a mixed integer program (MIP). A model for investment planning of power generation and a long-term decision model are developed in order to help the system engineer build a cost-effective system.
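As a toy illustration of the superstructure idea, where every candidate technology is represented by selection and sizing variables inside one cost-minimizing MIP, the sketch below uses the PuLP package with invented cost and output numbers; it shows only the structure of such a formulation, not the authors' model.

    from pulp import LpProblem, LpVariable, LpMinimize, lpSum, value

    # Candidate equipment options: (lifecycle cost per unit, energy output per unit).
    options = {"panel_A": (900, 1.2), "panel_B": (1400, 2.0), "panel_C": (2100, 3.1)}
    demand = 10.0  # required output, illustrative units

    prob = LpProblem("pv_superstructure", LpMinimize)
    n = {k: LpVariable(f"n_{k}", lowBound=0, cat="Integer") for k in options}
    use = {k: LpVariable(f"use_{k}", cat="Binary") for k in options}

    prob += lpSum(options[k][0] * n[k] for k in options)            # minimize cost
    prob += lpSum(options[k][1] * n[k] for k in options) >= demand  # meet demand
    for k in options:                                               # link count to selection
        prob += n[k] <= 50 * use[k]
    prob += lpSum(use[k] for k in options) <= 2                     # limit equipment mix

    prob.solve()
    print({k: value(n[k]) for k in options})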
Study on the keV neutron capture reaction in 56Fe and 57Fe
NASA Astrophysics Data System (ADS)
Wang, Taofeng; Lee, Manwoo; Kim, Guinyun; Ro, Tae-Ik; Kang, Yeong-Rok; Igashira, Masayuki; Katabuchi, Tatsuya
2014-03-01
The neutron capture cross-sections and the radiative capture gamma-ray spectra from the broad resonances of 56Fe and 57Fe in the neutron energy range from 10 to 90 keV and at 550 keV have been measured with an anti-Compton NaI(Tl) detector. Pulsed keV neutrons were produced from the 7Li(p,n)7Be reaction by bombarding the lithium target with the 1.5 ns bunched proton beam from the 3 MV Pelletron accelerator. The incident neutron spectrum on a capture sample was measured by means of a time-of-flight (TOF) method with a 6Li-glass detector. The number of weighted capture counts of the iron or gold sample was obtained by applying a pulse height weighting technique to the corresponding capture gamma-ray pulse height spectrum. The neutron capture gamma-ray spectra were obtained by unfolding the observed capture gamma-ray pulse height spectra. To achieve further understanding of the mechanism of the neutron radiative capture reaction and to study physics models, theoretical calculations of the gamma-ray spectra for 56Fe and 57Fe with the POD program have been performed by applying the Hauser-Feshbach statistical model. The dominant ingredients of the statistical calculation were the Optical Model Potential (OMP), the level densities described by the Mengoni-Nakajima approach, and the gamma-ray transmission coefficients described by gamma-ray strength functions. The comparison of the theoretical calculations, performed only for the 550 keV point, shows a good agreement with the present experimental results.
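The pulse height weighting technique mentioned above reduces, in its simplest form, to summing the observed spectrum with an energy-dependent weight chosen so that the detector response becomes proportional to total gamma-ray energy. A minimal sketch follows; the polynomial weighting function and spectrum values are placeholders, not the actual NaI(Tl) response functions used in the measurement.

    def weighted_capture_counts(spectrum, weight_coeffs):
        """Apply a pulse-height weighting function W(E) = sum_k a_k * E**k
        to an observed pulse-height spectrum [(energy_MeV, counts), ...],
        so the summed response becomes proportional to total gamma energy."""
        def W(E):
            return sum(a * E**k for k, a in enumerate(weight_coeffs))
        return sum(W(E) * c for E, c in spectrum)

    # Placeholder spectrum and linear weighting function, for illustration only.
    spectrum = [(0.5, 1200), (1.0, 800), (2.0, 350), (4.0, 90)]
    print(weighted_capture_counts(spectrum, weight_coeffs=[0.0, 1.0]))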
Shipboard Analytical Capabilities on the Renovated JOIDES Resolution, IODP Riserless Drilling Vessel
NASA Astrophysics Data System (ADS)
Blum, P.; Foster, P.; Houpt, D.; Bennight, C.; Brandt, L.; Cobine, T.; Crawford, W.; Fackler, D.; Fujine, K.; Hastedt, M.; Hornbacher, D.; Mateo, Z.; Moortgat, E.; Vasilyev, M.; Vasilyeva, Y.; Zeliadt, S.; Zhao, J.
2008-12-01
The JOIDES Resolution (JR) has conducted 121 scientific drilling expeditions during the Ocean Drilling Program (ODP) and the first phase of the Integrated Ocean Drilling Program (IODP) (1983-2006). The vessel and scientific systems have just completed an NSF-sponsored renovation (2005-2008). Shipboard analytical systems have been upgraded, within funding constraints imposed by market driven vessel conversion cost increases, to include: (1) enhanced shipboard analytical services including instruments and software for sampling and the capture of chemistry, physical properties, and geological data; (2) new data management capabilities built around a laboratory information management system (LIMS), digital asset management system, and web services; (3) operations data services with enhanced access to navigation and rig instrumentation data; and (4) a combination of commercial and home-made user applications for workflow- specific data extractions, generic and customized data reporting, and data visualization within a shipboard production environment. The instrumented data capture systems include a new set of core loggers for rapid and non-destructive acquisition of images and other physical properties data from drill cores. Line-scan imaging and natural gamma ray loggers capture data at unprecedented quality due to new and innovative designs. Many instruments used to characterize chemical compounds of rocks, sediments, and interstitial fluids were upgraded with the latest technology. The shipboard analytical environment features a new and innovative framework (DESCinfo) and application (DESClogik) for capturing descriptive and interpretive data from geological sub-domains such as sedimentology, petrology, paleontology, structural geology, stratigraphy, etc. This system fills a long-standing gap by providing a global database, controlled vocabularies and taxa name lists with version control, a highly configurable spreadsheet environment for data capture, and visualization of context data collected with the shipboard core loggers and other instruments.
Oldenburg, Curtis M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division
2018-05-07
Summer Lecture Series 2009: Climate change provides strong motivation to reduce CO2 emissions from the burning of fossil fuels. Carbon dioxide capture and storage involves the capture, compression, and transport of CO2 to geologically favorable areas, where it is injected into porous rock more than one kilometer underground for permanent storage. Oldenburg, who heads Berkeley Lab's Geologic Carbon Sequestration Program, will focus on the challenges, opportunities, and research needs of this innovative technology.
Applied research of embedded WiFi technology in the motion capture system
NASA Astrophysics Data System (ADS)
Gui, Haixia
2012-04-01
Embedded wireless WiFi technology is one of the current hot spots in wireless network applications. This paper first introduces the definition and characteristics of WiFi. Given WiFi's advantages of wire-free installation, simple operation, and stable transmission, the paper then presents a system design applying embedded wireless WiFi technology in a motion capture system, and verifies the effectiveness of the design through the WiFi-based wireless sensor hardware and software.
A new dawn for industrial photosynthesis.
Robertson, Dan E; Jacobson, Stuart A; Morgan, Frederick; Berry, David; Church, George M; Afeyan, Noubar B
2011-03-01
Several emerging technologies are aiming to meet renewable fuel standards, mitigate greenhouse gas emissions, and provide viable alternatives to fossil fuels. Direct conversion of solar energy into fungible liquid fuel is a particularly attractive option, though conversion of that energy on an industrial scale depends on the efficiency of its capture and conversion. Large-scale programs have been undertaken in the recent past that used solar energy to grow innately oil-producing algae for biomass processing to biodiesel fuel. These efforts were ultimately deemed to be uneconomical because the costs of culturing, harvesting, and processing of algal biomass were not balanced by the process efficiencies for solar photon capture and conversion. This analysis addresses solar capture and conversion efficiencies and introduces a unique systems approach, enabled by advances in strain engineering, photobioreactor design, and a process that contradicts prejudicial opinions about the viability of industrial photosynthesis. We calculate efficiencies for this direct, continuous solar process based on common boundary conditions, empirical measurements and validated assumptions wherein genetically engineered cyanobacteria convert industrially sourced, high-concentration CO(2) into secreted, fungible hydrocarbon products in a continuous process. These innovations are projected to operate at areal productivities far exceeding those based on accumulation and refining of plant or algal biomass or on prior assumptions of photosynthetic productivity. This concept, currently enabled for production of ethanol and alkane diesel fuel molecules, and operating at pilot scale, establishes a new paradigm for high productivity manufacturing of nonfossil-derived fuels and chemicals.
Liu, Yang; Hoppe, Brenda O; Convertino, Matteo
2018-04-10
Emergency risk communication (ERC) programs that activate when the ambient temperature is expected to cross certain extreme thresholds are widely used to manage relevant public health risks. In practice, however, the effectiveness of these thresholds has rarely been examined. The goal of this study is to test if the activation criteria based on extreme temperature thresholds, both cold and heat, capture elevated health risks for all-cause and cause-specific mortality and morbidity in the Minneapolis-St. Paul Metropolitan Area. A distributed lag nonlinear model (DLNM) combined with a quasi-Poisson generalized linear model is used to derive the exposure-response functions between daily maximum heat index and mortality (1998-2014) and morbidity (emergency department visits; 2007-2014). Specific causes considered include cardiovascular, respiratory, renal diseases, and diabetes. Six extreme temperature thresholds, corresponding to 1st-3rd and 97th-99th percentiles of local exposure history, are examined. All six extreme temperature thresholds capture significantly increased relative risks for all-cause mortality and morbidity. However, the cause-specific analyses reveal heterogeneity. Extreme cold thresholds capture increased mortality and morbidity risks for cardiovascular and respiratory diseases and extreme heat thresholds for renal disease. Percentile-based extreme temperature thresholds are appropriate for initiating ERC targeting the general population. Tailoring ERC by specific causes may protect some but not all individuals with health conditions exacerbated by hazardous ambient temperature exposure. © 2018 Society for Risk Analysis.
Hilton, Lara; Elfenbaum, Pamela; Jain, Shamini; Sprengel, Meredith; Jonas, Wayne B.
2016-01-01
Background: The evaluation of freestanding integrative cancer clinical programs is challenging and is rarely done. We have developed an approach called the Claim Assessment Profile (CAP) to identify whether evaluation of a practice is justified, feasible, and likely to provide useful information. Objectives: A CAP was performed in order to (1) clarify the healing claims at InspireHealth, an integrative oncology treatment program, by defining the most important impacts on its clients; (2) gather information about current research capacity at the clinic; and (3) create a program theory and path model for use in prospective research. Study Design/Methods: This case study design incorporates methods from a variety of rapid assessment approaches. Procedures included site visits to observe the program, structured qualitative interviews with 26 providers and staff, surveys to capture descriptive data about the program, and observational data on program implementation. Results: The InspireHealth program is a well-established, multi-site, thriving integrative oncology clinical practice that focuses on patient support, motivation, and health behavior engagement. It delivers patient-centered care via a standardized treatment protocol. There are high levels of research interest from staff and resources by which to conduct research. Conclusions: This analysis provides the primary descriptive and claims clarification of an integrative oncology treatment program, an evaluation readiness report, a detailed logic model explicating program theory, and a clinical outcomes path model for conducting prospective research. Prospective evaluation of this program would be feasible and valuable, adding to our knowledge base of integrative cancer therapies. PMID:29444602
VIV analysis of pipelines under complex span conditions
NASA Astrophysics Data System (ADS)
Wang, James; Steven Wang, F.; Duan, Gang; Jukes, Paul
2009-06-01
Spans occur when a pipeline is laid on a rough undulating seabed or when upheaval buckling occurs due to constrained thermal expansion. This not only results in static and dynamic loads on the flowline at span sections, but also generates vortex induced vibration (VIV), which can lead to fatigue issues. The phenomenon, if not predicted and controlled properly, will negatively affect pipeline integrity, leading to expensive remediation and intervention work. Span analysis can be complicated by: long span lengths, a large number of spans caused by a rough seabed, and multi-span interactions. In addition, the complexity can be more onerous and challenging when soil uncertainty, concrete degradation and unknown residual lay tension are considered in the analysis. This paper describes the latest developments and a ‘state-of-the-art’ finite element analysis program that has been developed to simulate the span response of a flowline under complex boundary and loading conditions. Both VIV and direct wave loading are captured in the analysis and the results are sequentially used for the ultimate limit state (ULS) check and fatigue life calculation.
The effects of water depth on prey detection and capture by juvenile coho salmon and steelhead
J.J. Piccolo; N.F. Hughes; M.D. Bryant
2007-01-01
We used three-dimensional video analysis of feeding experiments to determine the effects of water depth on prey detection and capture by drift-feeding juvenile coho salmon (Oncorhynchus kisutch) and steelhead (O. mykiss irideus). Depth treatments were 0.15, 0.30, 0.45 and 0.60 m. Mean prey capture probabilities for both species...
Byeon, Ji-Yeon; Bailey, Ryan C
2011-09-07
High affinity capture agents recognizing biomolecular targets are essential in the performance of many proteomic detection methods. Herein, we report the application of a label-free silicon photonic biomolecular analysis platform for simultaneously determining kinetic association and dissociation constants for two representative protein capture agents: a thrombin-binding DNA aptamer and an anti-thrombin monoclonal antibody. The scalability and inherent multiplexing capability of the technology make it an attractive platform for simultaneously evaluating the binding characteristics of multiple capture agents recognizing the same target antigen, and thus a tool complementary to emerging high-throughput capture agent generation strategies.
Biomechanical analysis using Kinovea for sports application
NASA Astrophysics Data System (ADS)
Muaza Nor Adnan, Nor; Patar, Mohd Nor Azmi Ab; Lee, Hokyoo; Yamamoto, Shin-Ichiroh; Jong-Young, Lee; Mahmud, Jamaluddin
2018-04-01
This paper assesses the reliability of HD VideoCam–Kinovea as an alternative tool for conducting motion analysis and measuring the knee relative angle during drop jump movement. The motion capture and analysis procedure was conducted in the Biomechanics Lab, Shibaura Institute of Technology, Omiya Campus, Japan. A healthy subject without any gait disorder (BMI of 28.60 ± 1.40) was recruited. The volunteer subject was asked to perform the drop jump movement on a preset platform, and the motion was simultaneously recorded using an established infrared motion capture system (Hawk–Cortex) and a HD VideoCam in the sagittal plane only. The capture was repeated 5 times. The outputs (video recordings) from the HD VideoCam were input into Kinovea (an open-source software) and the drop jump pattern was tracked and analysed. These data were compared with the drop jump pattern tracked and analysed earlier using the Hawk–Cortex system. In general, the results obtained (drop jump pattern) using the HD VideoCam–Kinovea are close to the results obtained using the established motion capture system. Basic statistical analyses show that most average variances are less than 10%, thus proving the repeatability of the protocol and the reliability of the results. It can be concluded that the integration of HD VideoCam–Kinovea has the potential to become a reliable motion capture–analysis system. Moreover, it is low cost, portable and easy to use. The current study and its findings are useful and contribute significant knowledge pertaining to motion capture–analysis, drop jump movement and HD VideoCam–Kinovea integration.
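The knee relative angle extracted from Kinovea's tracked markers reduces to the angle between thigh and shank segment vectors. A small sketch of that computation is shown below; the 2D sagittal-plane marker coordinates are hypothetical values, not data from the study.

    import math

    def knee_angle(hip, knee, ankle):
        """Relative knee angle (degrees) between thigh and shank vectors
        computed from 2D sagittal-plane marker coordinates."""
        thigh = (hip[0] - knee[0], hip[1] - knee[1])
        shank = (ankle[0] - knee[0], ankle[1] - knee[1])
        dot = thigh[0] * shank[0] + thigh[1] * shank[1]
        norm = math.hypot(*thigh) * math.hypot(*shank)
        return math.degrees(math.acos(dot / norm))

    # Hypothetical marker positions (pixels) at one video frame.
    print(f"{knee_angle((310, 120), (330, 260), (300, 395)):.1f} deg")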
Russ, Alissa L; Militello, Laura G; Glassman, Peter A; Arthur, Karen J; Zillich, Alan J; Weiner, Michael
2017-05-03
Cognitive task analysis (CTA) can yield valuable insights into healthcare professionals' cognition and inform system design to promote safe, quality care. Our objective was to adapt CTA-the critical decision method, specifically-to investigate patient safety incidents, overcome barriers to implementing this method, and facilitate more widespread use of cognitive task analysis in healthcare. We adapted CTA to facilitate recruitment of healthcare professionals and developed a data collection tool to capture incidents as they occurred. We also leveraged the electronic health record (EHR) to expand data capture and used EHR-stimulated recall to aid reconstruction of safety incidents. We investigated 3 categories of medication-related incidents: adverse drug reactions, drug-drug interactions, and drug-disease interactions. Healthcare professionals submitted incidents, and a subset of incidents was selected for CTA. We analyzed several outcomes to characterize incident capture and completed CTA interviews. We captured 101 incidents. Eighty incidents (79%) met eligibility criteria. We completed 60 CTA interviews, 20 for each incident category. Capturing incidents before interviews allowed us to shorten the interview duration and reduced reliance on healthcare professionals' recall. Incorporating the EHR into CTA enriched data collection. The adapted CTA technique was successful in capturing specific categories of safety incidents. Our approach may be especially useful for investigating safety incidents that healthcare professionals "fix and forget." Our innovations to CTA are expected to expand the application of this method in healthcare and inform a wide range of studies on clinical decision making and patient safety.
Rate-dependent Loss of Capture during Ventricular Pacing.
Wang, Jingfeng; Chen, Haiyan; Su, Yangang; Ge, Junbo
2015-01-01
A 63-year-old patient who had undergone atrial septal defect surgical repair received implantation of a single chamber VVI pacemaker for long RR intervals during atrial fibrillation. One week later, an intermittent loss of capture and sensing failure was detected at a pacing rate of 70 beats/min. However, a successful capture was observed during rapid pacing. Consequently, the pacing rate was temporarily adjusted to 90 beats/min. At the 3-month follow-up, the pacemaker was shown to be functioning properly independent of the pacing rate. An echocardiogram showed that the increased pacing rates were accompanied by a reduction in the right ventricular outflow tract dimension. The pacemaker was then permanently programmed at a lower rate of 60 beats/min.
Guidance, Navigation, and Control Techniques and Technologies for Active Satellite Removal
NASA Astrophysics Data System (ADS)
Ortega Hernando, Guillermo; Erb, Sven; Cropp, Alexander; Voirin, Thomas; Dubois-Matra, Olivier; Rinalducci, Antonio; Visentin, Gianfranco; Innocenti, Luisa; Raposo, Ana
2013-09-01
This paper shows an internal feasibility analysis, by the Technical Directorate of the European Space Agency (ESA), of de-orbiting a large non-functional satellite. The paper focuses specifically on the design of the techniques and technologies for the Guidance, Navigation, and Control (GNC) system of the spacecraft mission that will capture the satellite and ultimately de-orbit it in a controlled re-entry. The paper explains the guidance strategies to launch, rendezvous, close-approach, and capture the target satellite. The guidance strategy uses chaser manoeuvres, hold points, and collision avoidance trajectories to ensure a safe capture. It also details the guidance profile to de-orbit it in a controlled re-entry. The paper continues with an analysis of the required sensing suite and the navigation algorithms to allow the homing, fly-around, and capture of the target satellite. The emphasis is placed on the design of a system to allow rendezvous with an un-cooperative target, including the autonomous acquisition of both the orbital elements and the attitude of the target satellite. Analysing the capture phase, the paper provides a trade-off between two selected capture systems: the net and the tentacles. Both are studied from the point of view of the GNC system. The paper also analyses the advanced algorithms proposed to control the final compound after the capture, which will allow the controlled de-orbiting of the assembly in a safe area on the Earth. The paper ends by proposing to continue this work with an extension to the analysis of the destruction process of the compound in consecutive segments, starting from the entry gate to the rupture and break-up.
Full-motion video analysis for improved gender classification
NASA Astrophysics Data System (ADS)
Flora, Jeffrey B.; Lochtefeld, Darrell F.; Iftekharuddin, Khan M.
2014-06-01
The ability of computer systems to perform gender classification using the dynamic motion of the human subject has important applications in medicine, human factors, and human-computer interface systems. Previous works in motion analysis have used data from sensors (including gyroscopes, accelerometers, and force plates), radar signatures, and video. However, full-motion video motion capture provides a higher-resolution temporal and spatial dataset for the analysis of dynamic motion. Works using motion capture data have been limited by small datasets in a controlled environment. In this paper, we apply machine learning techniques to a new dataset that has a larger number of subjects. Additionally, these subjects move unrestricted through a capture volume, representing a more realistic, less controlled environment. We conclude that existing linear classification methods are insufficient for gender classification on the larger dataset captured in a relatively uncontrolled environment. A method based on a nonlinear support vector machine classifier is proposed to obtain gender classification for the larger dataset. In experimental testing with a dataset consisting of 98 trials (49 subjects, 2 trials per subject), classification rates using leave-one-out cross-validation are improved from 73% using linear discriminant analysis to 88% using the nonlinear support vector machine classifier.
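For context, leave-one-out comparisons of this kind are straightforward to set up with scikit-learn; the sketch below shows the typical structure, with a random placeholder feature matrix standing in for the gait dataset (it will not reproduce the accuracies above).

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.svm import SVC

    # Placeholder stand-in for motion-capture gait features: 98 trials x 20 features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(98, 20))
    y = rng.integers(0, 2, size=98)  # binary gender labels

    loo = LeaveOneOut()
    for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                      ("RBF SVM", SVC(kernel="rbf", C=1.0, gamma="scale"))]:
        acc = cross_val_score(clf, X, y, cv=loo).mean()
        print(f"{name}: leave-one-out accuracy = {acc:.2f}")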
NASA Technical Reports Server (NTRS)
Follen, Gregory J.; Naiman, Cynthia G.
1999-01-01
The NASA Lewis Research Center is developing an environment for analyzing and designing aircraft engines: the Numerical Propulsion System Simulation (NPSS). NPSS will integrate multiple disciplines, such as aerodynamics, structure, and heat transfer, and will make use of numerical "zooming" on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS uses the latest computing and communication technologies to capture complex physical processes in a timely, cost-effective manner. The vision of NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Through the NASA/Industry Cooperative Effort agreement, NASA Lewis and industry partners are developing a new engine simulation called the National Cycle Program (NCP). NCP, which is the first step toward NPSS and is its initial framework, supports the aerothermodynamic system simulation process for the full life cycle of an engine. U.S. aircraft and airframe companies recognize NCP as the future industry standard common analysis tool for aeropropulsion system modeling. The estimated potential payoff for NCP is a $50 million/yr savings to industry through improved engineering productivity.
Exploring Mission Concepts with the JPL Innovation Foundry A-Team
NASA Technical Reports Server (NTRS)
Ziemer, John K.; Ervin, Joan; Lang, Jared
2013-01-01
The JPL Innovation Foundry has established a new approach for exploring, developing, and evaluating early concepts called the A-Team. The A-Team combines innovative collaborative methods with subject matter expertise and analysis tools to help mature mission concepts. Science, implementation, and programmatic elements are all considered during an A-Team study. Methods are grouped by Concept Maturity Level (CML), from 1 through 3, including idea generation and capture (CML 1), initial feasibility assessment (CML 2), and trade space exploration (CML 3). Methods used for each CML are presented, and the key team roles are described from two points of view: innovative methods and technical expertise. A-Team roles for providing innovative methods include the facilitator, study lead, and assistant study lead. A-Team roles for providing technical expertise include the architect, lead systems engineer, and integration engineer. In addition to these key roles, each A-Team study is uniquely staffed to match the study topic and scope including subject matter experts, scientists, technologists, flight and instrument systems engineers, and program managers as needed. Advanced analysis and collaborative engineering tools (e.g. cost, science traceability, mission design, knowledge capture, study and analysis support infrastructure) are also under development for use in A-Team studies and will be discussed briefly. The A-Team facilities provide a constructive environment for innovative ideas from all aspects of mission formulation to eliminate isolated studies and come together early in the development cycle when they can provide the biggest impact. This paper provides an overview of the A-Team, its study processes, roles, methods, tools and facilities.
Perry, Russell W.; Kirsch, Joseph E.; Hendrix, A. Noble
2016-06-17
Resource managers rely on abundance or density metrics derived from beach seine surveys to make vital decisions that affect fish population dynamics and assemblage structure. However, abundance and density metrics may be biased by imperfect capture and lack of geographic closure during sampling. Currently, there is considerable uncertainty about the capture efficiency of juvenile Chinook salmon (Oncorhynchus tshawytscha) by beach seines. Heterogeneity in capture can occur through unrealistic assumptions of closure and from variation in the probability of capture caused by environmental conditions. We evaluated the assumptions of closure and the influence of environmental conditions on capture efficiency and abundance estimates of Chinook salmon from beach seining within the Sacramento–San Joaquin Delta and the San Francisco Bay. Beach seine capture efficiency was measured using a stratified random sampling design combined with open and closed replicate depletion sampling. A total of 56 samples were collected during the spring of 2014. To assess variability in capture probability and the absolute abundance of juvenile Chinook salmon, beach seine capture efficiency data were fitted to the paired depletion design using modified N-mixture models. These models allowed us to explicitly test the closure assumption and estimate environmental effects on the probability of capture. We determined that our updated method allowing for lack of closure between depletion samples drastically outperformed traditional data analysis that assumes closure among replicate samples. The best-fit model (lowest-valued Akaike Information Criterion model) included the probability of fish being available for capture (relaxed closure assumption), capture probability modeled as a function of water velocity and percent coverage of fine sediment, and abundance modeled as a function of sample area, temperature, and water velocity. Given that beach seining is a ubiquitous sampling technique for many species, our improved sampling design and analysis could provide significant improvements in density and abundance estimation.
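The closed-population baseline that the authors' open model generalizes is the classic two-pass removal estimator, which follows directly from the expected catches: c1 = N*p and c2 = N*(1-p)*p give p = 1 - c2/c1 and N = c1^2/(c1 - c2). The sketch below implements it with toy catch counts; the paper's actual N-mixture formulation additionally estimates availability and environmental covariate effects.

    def two_pass_removal(c1, c2):
        """Classic closed two-pass removal estimator.
        Expected catches c1 = N*p and c2 = N*(1-p)*p solve to
        p = 1 - c2/c1 and N = c1**2 / (c1 - c2)."""
        if c2 >= c1:
            raise ValueError("second-pass catch must decline under closure")
        p_hat = 1.0 - c2 / c1
        n_hat = c1 ** 2 / (c1 - c2)
        return p_hat, n_hat

    # Toy depletion catches at one beach-seine site (illustrative numbers).
    p, n = two_pass_removal(c1=34, c2=15)
    print(f"capture probability ~ {p:.2f}, abundance ~ {n:.0f}")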
NASA Astrophysics Data System (ADS)
Horvat, Vladimir
2009-06-01
ERCS08 is a program for computing atomic electron removal cross sections. It is written in FORTRAN in order to make it more portable and easier to customize by a large community of physicists, but it also comes with a separate Windows graphics user interface control application, ERCS08w, that makes it easy to quickly prepare the input file, run the program, and view and analyze the output. The calculations are based on the ECPSSR theory for direct (Coulomb) ionization and non-radiative electron capture. With versatility in mind, the program allows for selective inclusion or exclusion of individual contributions to the cross sections from effects such as projectile energy loss, Coulomb deflection of the projectile, perturbation of the electron's stationary state (polarization and binding), as well as relativity. This makes it straightforward to assess the importance of each effect in a given collision regime. The control application also makes it easy to set up calculations in inverse kinematics (i.e. ionization of projectile ions by target atoms or ions).
Program summary
Program title: ERCS08
Catalogue identifier: AECU_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECU_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 12 832
No. of bytes in distributed program, including test data, etc.: 318 420
Distribution format: tar.gz
Programming language: Once the input file is prepared (using a text editor or ERCS08w), all the calculations are done in FORTRAN using double precision.
Computer: see "Operating system" below
Operating system: The main program (ERCS08) can run on any computer equipped with a FORTRAN compiler. Its pre-compiled executable file (supplied) runs under DOS or Windows. The supplied graphics user interface control application (ERCS08w) requires a Windows operating system. ERCS08w is designed to be used along with a text editor. Any editor can be used, including the one that comes with the operating system (for example, Edit for DOS or Notepad for Windows).
Classification: 16.7, 16.8
Nature of problem: ECPSSR has become a typical tag word for a theory that goes beyond the standard plane wave Born approximation (PWBA) in order to predict the cross sections for direct (Coulomb) ionization of atomic electrons by projectile ions, taking into account the energy loss (E) and Coulomb deflection (C) of the projectile, as well as the perturbed stationary state (PSS) and relativistic nature (R) of the target electron. Its treatment of non-radiative electron capture to the projectile goes beyond the Oppenheimer-Brinkman-Kramers approximation (OBK) to include the effects of C, PSS, and R. PSS is described in terms of increased target electron binding (B) due to the presence of the projectile in the vicinity of the target nucleus, and (for direct ionization only) polarization of the target electron cloud (P) while the projectile is outside the electron's shell radius. Several modifications of the theory have been recently suggested or endorsed by one of its authors (Lapicki). These modifications are sometimes explicit in the tag word (for example, eCPSSR, eCUSR, ReCPSShsR, etc.). A cross section for the ionization of a target electron is assumed to equal the sum of the cross sections for direct ionization (DI) and electron capture (EC).
Solution method: The calculations are based on the ECPSSR theory for direct (Coulomb) ionization and non-radiative electron capture. With versatility in mind, the program allows for selective inclusion or exclusion of individual contributions to the cross sections from effects such as projectile energy loss, Coulomb deflection of the projectile, perturbation of the electron's stationary state (polarization and binding), as well as relativity. This makes it straightforward to assess the importance of each effect in a given collision regime. The control application also makes it easy to set up calculations in inverse kinematics (i.e. ionization of projectile ions by target atoms or ions).
Restrictions: The program is restricted to the ionization of K, L, and M electrons. The theory is non-relativistic, which effectively limits its applicability to projectile energies up to about 50 MeV/amu. However, the theory is extended to apply to relativistic light projectiles. Radiative electron capture is not taken into account, since its contribution is found to be negligible in the collision regimes covered by the ECPSSR theory.
Unusual features: Windows graphics user interface along with a FORTRAN code for calculations, selective inclusion or exclusion of specific corrections, inclusion of the extension to relativistic light projectiles, inclusion of non-radiative electron capture.
Running time: Running the program using the input data provided with the distribution only takes a few seconds.
NASA Astrophysics Data System (ADS)
Kuehl, C. Stephen
2003-08-01
Completing its final development and early deployment on the Navy's multi-role aircraft, the F/A-18 E/F Super Hornet, the SHAred Reconnaissance Pod (SHARP) provides the war fighter with the latest digital tactical reconnaissance (TAC Recce) Electro-Optical/Infrared (EO/IR) sensor system. The SHARP program is an evolutionary acquisition that used a spiral development process across a prototype development phase tightly coupled into overlapping Engineering and Manufacturing Development (EMD) and Low Rate Initial Production (LRIP) phases. Under a tight budget environment with a highly compressed schedule, SHARP challenged traditional acquisition strategies and systems engineering (SE) processes. Adopting tailored state-of-the-art systems engineering process models allowed the SHARP program to overcome the technical knowledge transition challenges imposed by a compressed program schedule. The program's original goal was the deployment of digital TAC Recce mission capabilities to the fleet customer by summer of 2003. Hardware and software integration technical challenges resulted from requirements definition and analysis activities performed across a government-industry led Integrated Product Team (IPT) involving Navy engineering and test sites, Boeing, and RTSC-EPS (with its subcontracted hardware and government furnished equipment vendors). Requirements development from a bottom-up approach was adopted using an electronic requirements capture environment to clarify and establish the SHARP EMD product baseline specifications as relevant technical data became available. Applying Earned-Value Management (EVM) against an Integrated Master Schedule (IMS) resulted in efficiently managing SE task assignments and product deliveries in a dynamically evolving customer requirements environment. Application of Six Sigma improvement methodologies resulted in the uncovering of root causes of errors in wiring interconnectivity drawings, pod manufacturing processes, and avionics requirements specifications. Utilizing the draft NAVAIR SE guideline handbook and the ANSI/EIA-632 standard, Processes for Engineering a System, a tailored systems engineering process approach was adopted for the accelerated SHARP EMD program. Tailoring SE processes in this accelerated product delivery environment provided unique opportunities to be technically creative in the establishment of a product performance baseline. This paper provides an historical overview of the systems engineering activities spanning the prototype phase through the EMD SHARP program phase, the performance requirement capture activities and refinement process challenges, and what SE process improvements can be applied to future SHARP-like programs adopting a compressed, evolutionary spiral development acquisition paradigm.
2007-11-01
Engineering Research Laboratory is currently developing a set of facility ‘architectural’ programming tools, called Facility Composer™ (FC). FC...requirements in the early phases of project development. As the facility program, criteria, and requirements are chosen, these tools populate the IFC...developing a set of facility “architectural” programming tools, called Facility Composer (FC), to support the capture and tracking of facility criteria
Cosmic Radiation Detection and Observations
NASA Astrophysics Data System (ADS)
Ramirez Chavez, Juan; Troncoso, Maria
Cosmic rays consist of high-energy particles accelerated from remote supernova remnant explosions that travel vast distances throughout the universe. Upon arriving at Earth, the majority of these particles ionize gases in the upper atmosphere, while others interact with gas molecules in the troposphere, producing secondary cosmic rays, which are the main focus of this research. To observe these secondary cosmic rays, a detector telescope was designed and equipped with two silicon photomultipliers (SiPMs). Each SiPM is coupled to a bundle of 4 wavelength-shifting optical fibers that are embedded inside a plastic scintillator sheet. The SiPM signals were amplified using a fast preamplifier, with coincidence between detectors established using a binary logic gate. The coincidence events were recorded with two devices: a digital counter and an Arduino micro-controller. For detailed analysis of the SiPM waveforms, a DRS4 sensory digitizer captured the waveforms for offline analysis with the CERN software package Physics Analysis Workstation in a Linux environment. Results from our experiments will be presented. Hartnell College STEM Internship Program.
Structural Weight Estimation for Launch Vehicles
NASA Technical Reports Server (NTRS)
Cerro, Jeff; Martinovic, Zoran; Su, Philip; Eldred, Lloyd
2002-01-01
This paper describes some of the work in progress to develop automated structural weight estimation procedures within the Vehicle Analysis Branch (VAB) of the NASA Langley Research Center. One task of the VAB is to perform system studies at the conceptual and early preliminary design stages on launch vehicles and in-space transportation systems. Some examples of these studies for Earth to Orbit (ETO) systems are the Future Space Transportation System [1], Orbit On Demand Vehicle [2], Venture Star [3], and the Personnel Rescue Vehicle [4]. Structural weight calculation for launch vehicle studies can exist on several levels of fidelity. Typically, historically based weight equations are used in a vehicle sizing program. Many of the studies in the Vehicle Analysis Branch have been enhanced in terms of structural weight fraction prediction by utilizing some level of off-line structural analysis to incorporate material property, load intensity, and configuration effects which may not be captured by the historical weight equations. Modification of Mass Estimating Relationships (MERs) to assess design and technology impacts on vehicle performance is necessary to prioritize design and technology development decisions. Modern CAD/CAE software, ever-increasing computational power, and platform-independent computer programming languages such as JAVA provide new means to create greater depth of analysis tools which can be included into the conceptual design phase of launch vehicle development. Commercial framework computing environments provide easy-to-program techniques which coordinate and implement the flow of data in a distributed heterogeneous computing environment. It is the intent of this paper to present a process in development at NASA LaRC for enhanced structural weight estimation using this state-of-the-art computational power.
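A mass estimating relationship of the kind being modified here is typically a power law fit to historical vehicle data; the sketch below fits one in log space with numpy, using invented data points purely to show the procedure rather than any VAB relationship.

    import numpy as np

    # Invented historical data: structural weight (kg) vs. gross liftoff weight (kg).
    glow = np.array([50e3, 120e3, 300e3, 750e3, 2000e3])
    w_struct = np.array([4.1e3, 9.0e3, 20.5e3, 47.0e3, 115.0e3])

    # Fit W = a * GLOW**b by linear regression in log space.
    b, log_a = np.polyfit(np.log(glow), np.log(w_struct), 1)
    a = np.exp(log_a)
    print(f"MER: W_struct ~ {a:.3f} * GLOW^{b:.3f}")

    # Apply the MER to a new concept vehicle (hypothetical 500-tonne GLOW).
    print(f"predicted structural weight: {a * (500e3)**b:,.0f} kg")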
Perlman, Michal; Fletcher, Brooke; Falenchuk, Olesya; Brunsek, Ashley; McMullen, Evelyn; Shah, Prakesh S
2017-01-01
Child-staff ratios are a key quality indicator in early childhood education and care (ECEC) programs. Better ratios are believed to improve child outcomes by increasing opportunities for individual interactions and educational instruction from staff. The purpose of this systematic review, and where possible, meta-analysis, was to evaluate the association between child-staff ratios in preschool ECEC programs and children's outcomes. Searches of Medline, PsycINFO, ERIC, websites of large datasets and reference sections of all retrieved articles were conducted up to July 3, 2015. Cross-sectional or longitudinal studies that evaluated the relationship between child-staff ratios in ECEC classrooms serving preschool aged children and child outcomes were independently identified by two reviewers. Data were independently extracted from included studies by two raters and differences between raters were resolved by consensus. Searches revealed 29 eligible studies (31 samples). Child-staff ratios ranged from 5 to 14.5 preschool-aged children per adult with a mean of 8.65. All 29 studies were included in the systematic review. However, the only meta-analysis that could be conducted was based on three studies that explored associations between ratios and children's receptive language. Results of this meta-analysis were not significant. Results of the qualitative systematic review revealed few significant relationships between child-staff ratios and child outcomes construed broadly. Thus, the available literature reveals few, if any, relationships between child-staff ratios in preschool ECEC programs and children's developmental outcomes. Substantial heterogeneity in the assessment of ratios, outcomes measured, and statistics used to capture associations limited quantitative synthesis. Other methodological limitations of the research integrated in this synthesis are discussed.
van der Riet, Pamela; Rossiter, Rachel; Kirby, Dianne; Dluzewska, Teresa; Harmon, Charles
2015-01-01
Widespread reports of high stress levels and mental health problems among university student populations indicate the use of interventions to facilitate stress reduction and support student resilience and wellbeing. There is growing evidence that regular mindfulness practice may confer positive health benefits and reduced stress levels. The aim of this pilot project was to explore the impact of a seven-week stress management and mindfulness program as a learning support and stress reduction method for nursing and midwifery students. The program was conducted at a large regional university in Australia. Fourteen first-year undergraduate nursing and midwifery students agreed to attend the program and to participate in a follow-up focus group. A descriptive qualitative design was utilised to examine the impact of the program. A semi-structured focus group interview was conducted with a thematic analysis undertaken of the transcript and process notes. Ten students completed the research component of this project by participating in the focus group interview. Three main themes capture the participants' experience: attending to self, attending to others, and attending to program-related challenges. Data indicate a positive impact on sleep, concentration, clarity of thought and a reduction in negative cognitions. Participants also identified challenges related to timetabling, program structure and venue. Overall, this pilot program enhanced the participants' sense of well-being. Despite the challenges, benefits were identified on a personal and professional level. Valuable feedback was provided that will be used to further develop and expand stress management and mindfulness programs offered to students attending this university. Copyright © 2014. Published by Elsevier Ltd.
Snake River Sockeye Salmon Captive Broodstock Program; Research Element, 2002 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willard, Catherine; Hebdon, J. Lance; Castillo, Jason
2004-06-01
On November 20, 1991, the National Oceanic Atmospheric Administration listed Snake River sockeye salmon Oncorhynchus nerka as endangered under the Endangered Species Act of 1973. In 1991, the Shoshone-Bannock Tribes and the Idaho Department of Fish and Game (IDFG) initiated the Snake River Sockeye Salmon Sawtooth Valley Project to conserve and rebuild populations in Idaho. Restoration efforts are focusing on Redfish, Pettit, and Alturas lakes within the Sawtooth Valley. The first release of hatchery-produced juvenile sockeye salmon from the captive broodstock program occurred in 1994. The first anadromous adult returns from the captive broodstock program were recorded in 1999 when six jacks and one jill were captured at IDFG's Sawtooth Fish Hatchery. In 2002, progeny from the captive broodstock program were released using four strategies: age-0 presmolts were released to Alturas, Pettit, and Redfish lakes in August and to Pettit and Redfish lakes in October, age-1 smolts were released to Redfish Lake Creek in May, eyed-eggs were planted in Pettit Lake in December, and hatchery-produced and anadromous adult sockeye salmon were released to Redfish Lake for volitional spawning in September. Oncorhynchus nerka population monitoring was conducted on Redfish, Alturas, and Pettit lakes using a midwater trawl in September 2002. Age-0, age-1, and age-2 O. nerka were captured in Redfish Lake, and population abundance was estimated at 50,204 fish. Age-0, age-1, age-2, and age-3 kokanee were captured in Alturas Lake, and population abundance was estimated at 24,374 fish. Age-2 and age-3 O. nerka were captured in Pettit Lake, and population abundance was estimated at 18,328 fish. The ultimate goal of the IDFG captive broodstock development and evaluation efforts is to recover sockeye salmon runs in Idaho waters. Recovery is defined as reestablishing sockeye salmon runs and providing for utilization of sockeye salmon and kokanee resources by anglers. The immediate project goal is to maintain this unique sockeye salmon population through captive broodstock technology and avoid species extinction. The project objectives are: (1) Develop captive broodstocks from Redfish Lake anadromous sockeye salmon. (2) Determine the contribution hatchery-produced sockeye salmon make toward avoiding population extinction and increasing population abundance. (3) Describe O. nerka population characteristics for Sawtooth Valley lakes in relation to carrying capacity and broodstock program supplementation efforts. (4) Refine our ability to discern the origin of wild and broodstock sockeye salmon to provide maximum effectiveness in their utilization within the broodstock program. (5) Transfer technology through participation in the technical oversight committee process, providing written activity reports and participation in essential program management and planning activities.
High Dynamic Range Digital Imaging of Spacecraft
NASA Technical Reports Server (NTRS)
Karr, Brian A.; Chalmers, Alan; Debattista, Kurt
2014-01-01
The ability to capture engineering imagery with a wide degree of dynamic range during rocket launches is critical for post-launch processing and analysis [USC03, NNC86]. Rocket launches often present an extreme range of lightness, particularly at night. Night launches present a two-fold problem: capturing detail of the vehicle and scene that is masked by darkness, while also capturing detail in the engine plume.
John J. Piccolo; Nicholas F. Hughes; Mason D. Bryant
2008-01-01
We examined the effects of water velocity on prey detection and capture by drift-feeding juvenile coho salmon (Oncorhynchus kisutch) and steelhead (sea-run rainbow trout,Oncorhynchus mykiss irideus) in laboratory experiments. We used repeated-measures analysis of variance to test the effects of velocity, species, and the velocity x species interaction on prey capture...
[Specificity of the Adultrap for capturing females of Aedes aegypti (Diptera: Culicidae)].
Gomes, Almério de Castro; da Silva, Nilza Nunes; Bernal, Regina Tomie Ivata; Leandro, André de Souza; de Camargo, Natal Jataí; da Silva, Allan Martins; Ferreira, Adão Celestino; Ogura, Luis Carlos; de Oliveira, Sebastião José; de Moura, Silvestre Marques
2007-01-01
The Adultrap is a new trap built for capturing females of Aedes aegypti. Tests were carried out to evaluate the specificity of this trap in comparison with the technique of aspiration of specimens in artificial shelters. Adultraps were kept for 24 hours inside and outside 120 randomly selected homes in two districts of the city of Foz do Iguaçú, State of Paraná. The statistical test was Poisson's log-linear model. The result was 726 mosquitoes captured, of which 80 were Aedes aegypti. The Adultrap captured only females of this species, while the aspiration method captured both sexes of Aedes aegypti and another five species. The Adultrap captured Aedes aegypti inside and outside the homes, but the analysis indicated that, outside the homes, this trap captured significantly more females than aspiration did. The sensitivity of the Adultrap for detecting females of Aedes aegypti in low-frequency situations was also demonstrated.
FUSE Observations of Neutron-Capture Elements in Wolf-Rayet Planetary Nebulae
NASA Astrophysics Data System (ADS)
Dinerstein, H.
We propose to obtain FUSE observations of planetary nebula central stars of the WC Wolf-Rayet ([WC]) class, in order to search for the products of neutron-capture processes in these stars and provide constraints on their evolutionary status. Although the origin of the [WC] stars is controversial, their H-deficient, C-rich surface compositions indicate that they have experienced a high degree of mixing and/or mass loss. Thus one might expect the nebulae they produce to show enhanced concentrations of He-burning and other nuclear products, such as nuclei produced by slow neutron capture during the AGB phase. We have already detected an absorption line from one such element, germanium (Sterling, Dinerstein, & Bowers 2002), while conducting a search for H2 absorption from nebular molecular material (FUSE GI programs A085 and B069). Since the strongest Ge enhancements were found in PNe with [WC] central stars, we propose to enlarge the sample of such objects observed by FUSE. THIS TEMPORARY AND PARTIAL SCRIPT COVERS ONE TARGET, HE 2-99, AND REQUESTS AN EXPOSURE TIME OF 15 KSEC. PHASE 2 INFORMATION FOR THE REMAINDER OF THE PROGRAM'S TOTAL TIME ALLOCATION OF 60 KSEC WILL BE SUBMITTED AT A LATER TIME.
Density functional calculations of multiphonon capture cross sections at defects in semiconductors
NASA Astrophysics Data System (ADS)
Barmparis, Georgios D.; Puzyrev, Yevgeniy S.; Zhang, X.-G.; Pantelides, Sokrates T.
2014-03-01
The theory of electron capture cross sections by multiphonon processes in semiconductors has a long and controversial history. Here we present a comprehensive theory and describe its implementation for realistic calculations. The Born-Oppenheimer and the Franck-Condon approximations are employed. The transition probability of an incoming electron is written as a product of an instantaneous electronic transition in the initial defect configuration and the line shape function (LSF) that describes the multiphonon processes that lead to lattice relaxation. The electronic matrix elements are calculated using the Projector Augmented Wave (PAW) method, which yields the true wave functions while still employing a plane-wave basis. The LSF is calculated by employing a Monte Carlo method and the real phonon modes of the defect, calculated using density functional theory in the PAW scheme. Initial results of the capture cross section for a prototype system, namely a triply hydrogenated vacancy in Si, are presented. The results are relevant for modeling device degradation by hot electron effects. This work is supported in part by the Samsung Advanced Institute of Technology (SAIT)'s Global Research Outreach (GRO) Program and by the LDRD program at ORNL.
The MSFC Solar Activity Future Estimation (MSAFE) Model
NASA Technical Reports Server (NTRS)
Suggs, Ron
2017-01-01
The MSAFE model provides forecasts for the solar indices SSN, F10.7, and Ap. These solar indices are used as inputs to space environment models used in orbital spacecraft operations and space mission analysis. Forecasts from the MSAFE model are provided on the MSFC Natural Environments Branch's solar web page and are updated as new monthly observations become available. The MSAFE prediction routine employs a statistical technique that calculates deviations of past solar cycles from the mean cycle and performs a regression analysis to calculate the deviation from the mean cycle of the solar index at the next future time interval. The forecasts are initiated for a given cycle after about 8 to 9 monthly observations from the start of the cycle are collected. A forecast made at the beginning of cycle 24 using the MSAFE program captured the cycle fairly well with some difficulty in discerning the double peak that occurred at solar cycle maximum.
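A minimal sketch of the mean-cycle idea described above, simplified to a single least-squares scale factor rather than MSAFE's full regression on cycle deviations; the mean cycle and the nine monthly observations are synthetic:

```python
import numpy as np

def forecast_cycle(mean_cycle, observed):
    """Project the remainder of a solar cycle from early observations.

    A simplified version of the mean-cycle/regression idea: fit one
    scale factor relating the observed months to the mean cycle by
    least squares, then apply it to the unobserved months.
    """
    k = len(observed)
    base = mean_cycle[:k]
    scale = np.dot(base, observed) / np.dot(base, base)  # least-squares fit
    return scale * mean_cycle[k:]

# Hypothetical 132-month mean sunspot cycle and 9 months of observations
months = np.arange(132)
mean_cycle = 120 * np.exp(-0.5 * ((months - 45) / 25) ** 2)
observed = 0.8 * mean_cycle[:9] + np.random.default_rng(0).normal(0, 2, 9)
print(forecast_cycle(mean_cycle, observed)[:5])
```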
Extracting Loop Bounds for WCET Analysis Using the Instrumentation Point Graph
NASA Astrophysics Data System (ADS)
Betts, A.; Bernat, G.
2009-05-01
Every calculation engine proposed in the literature of Worst-Case Execution Time (WCET) analysis requires upper bounds on loop iterations. Existing mechanisms to procure this information are either error prone, because they are gathered from the end-user, or limited in scope, because automatic analyses target very specific loop structures. In this paper, we present a technique that obtains bounds completely automatically for arbitrary loop structures. In particular, we show how to employ the Instrumentation Point Graph (IPG) to parse traces of execution (generated by an instrumented program) in order to extract bounds relative to any loop-nesting level. With this technique, therefore, non-rectangular dependencies between loops can be captured, allowing more accurate WCET estimates to be calculated. We demonstrate the improvement in accuracy by comparing WCET estimates computed through our HMB framework against those computed with state-of-the-art techniques.
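The paper's machinery is the Instrumentation Point Graph itself; as a rough sketch of the underlying idea, the code below parses a trace of instrumentation-point IDs against a known loop-nesting relation (both hypothetical stand-ins for the IPG) and extracts the bound of each loop relative to one entry of its enclosing loop, which is what captures non-rectangular loop dependencies:

```python
from collections import defaultdict

def loop_bounds(trace, loop_headers):
    """Extract loop bounds from an execution trace.

    trace: sequence of instrumentation-point IDs in execution order.
    loop_headers: dict mapping a loop-header ID to the ID of its
        enclosing loop's header (None for outermost loops).
    Returns, per loop, the maximum iteration count observed relative
    to one entry of its enclosing loop.
    """
    counts = defaultdict(int)   # iterations since the enclosing loop last fired
    bounds = defaultdict(int)   # maximum observed relative bound
    for point in trace:
        if point in loop_headers:
            counts[point] += 1
            bounds[point] = max(bounds[point], counts[point])
            # a new iteration of this loop restarts the loops nested in it
            for inner, outer in loop_headers.items():
                if outer == point:
                    counts[inner] = 0
    return dict(bounds)

# Hypothetical trace: outer loop 'A' runs 3 times; inner loop 'B' runs a
# non-rectangular 1, 2, 3 iterations on successive outer iterations.
trace = ["A", "B", "A", "B", "B", "A", "B", "B", "B"]
print(loop_bounds(trace, {"A": None, "B": "A"}))  # {'A': 3, 'B': 3}
```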
Image analysis of ocular fundus for retinopathy characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ushizima, Daniela; Cuadros, Jorge
2010-02-05
Automated analysis of ocular fundus images is a common procedure in countries such as England, including both nonemergency examination and retinal screening of patients with diabetes mellitus. This involves digital image capture and transmission of the images to a digital reading center for evaluation and treatment referral. In collaboration with the Optometry Department, University of California, Berkeley, we have tested computer vision algorithms to segment vessels and lesions in ground-truth data (DRIVE database) and hundreds of images of non-macula-centered and nonuniformly illuminated views of the eye fundus from the EyePACS program. Methods under investigation involve mathematical morphology (Figure 1) for image enhancement and pattern matching. Recently, we have focused on more efficient techniques to model the ocular fundus vasculature (Figure 2), using deformable contours. Preliminary results show accurate segmentation of vessels and a high level of true-positive microaneurysms.
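A minimal sketch of the kind of morphology-based vessel enhancement mentioned above, using scikit-image; the input image, structuring-element size, and threshold are placeholders, not the EyePACS pipeline:

```python
import numpy as np
from skimage import data, morphology, filters

# Hypothetical stand-in image; a real run would load a fundus photograph,
# whose green channel usually gives the best vessel contrast.
image = data.camera().astype(float) / 255.0

# Black top-hat (closing minus image) highlights dark, thin structures
# such as vessels against a brighter background.
enhanced = morphology.black_tophat(image, morphology.disk(8))

# Threshold the enhanced image to a rough vessel mask, then remove small
# speckle that would otherwise mimic lesions such as microaneurysms.
mask = enhanced > filters.threshold_otsu(enhanced)
mask = morphology.remove_small_objects(mask, min_size=50)
print(mask.sum(), "candidate vessel pixels")
```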
Admixture Aberration Analysis: Application to Mapping in Admixed Population Using Pooled DNA
NASA Astrophysics Data System (ADS)
Bercovici, Sivan; Geiger, Dan
Admixture mapping is a gene mapping approach used for the identification of genomic regions harboring disease susceptibility genes in the case of recently admixed populations such as African Americans. We present a novel method for admixture mapping, called admixture aberration analysis (AAA), that uses a DNA pool of affected admixed individuals. We demonstrate through simulations that AAA is a powerful and economical mapping method under a range of scenarios, capturing complex human diseases such as hypertension and end stage kidney disease. The method has a low false-positive rate and is robust to deviation from model assumptions. Finally, we apply AAA on 600 prostate cancer-affected African Americans, replicating a known risk locus. Simulation results indicate that the method can yield over 96% reduction in genotyping. Our method is implemented as a Java program called AAAmap and is freely available.
Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advance Molecular Imaging Tools.
Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe
2018-01-01
Processing and interpretation of biological images may provide invaluable insights on complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and different instrumentations focused on molecular image processing and analysis. Quantitative data analysis through various open source software and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight predictable future trends regarding methods for automatically analyzing biological data. Such tools will be very useful for understanding the detailed biological and mathematical expressions underlying in-silico systems biology processes and their modeling properties.
Reducing neural network training time with parallel processing
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.; Lamarsh, William J., II
1995-01-01
Obtaining optimal solutions for engineering design problems is often expensive because the process typically requires numerous iterations involving analysis and optimization programs. Previous research has shown that a near-optimum solution can be obtained in less time by simulating a slow, expensive analysis with a fast, inexpensive neural network. A new approach has been developed to further reduce this time. This approach decomposes a large neural network into many smaller neural networks that can be trained in parallel. Guidelines are developed to avoid some of the pitfalls of training smaller neural networks in parallel. These guidelines allow the engineer to determine the number of nodes on the hidden layer of the smaller neural networks, to choose the initial training weights, and to select a network configuration that will capture the interactions among the smaller neural networks. This paper presents results describing how these guidelines are developed.
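A minimal sketch of the decomposition idea, assuming the large network is split into one small network per analysis output and the subnetworks are trained in parallel processes; the data, architectures, and library choice (scikit-learn) are illustrative, not the paper's setup:

```python
import numpy as np
from multiprocessing import Pool
from sklearn.neural_network import MLPRegressor

def train_subnet(args):
    """Train one small network on a single output of the full mapping."""
    X, y = args
    net = MLPRegressor(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
    net.fit(X, y)
    return net

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(500, 3))          # design variables
    Y = np.column_stack([X.sum(axis=1),            # three analysis outputs,
                         X.prod(axis=1),           # each approximated by its
                         (X ** 2).sum(axis=1)])    # own small network
    with Pool() as pool:
        subnets = pool.map(train_subnet,
                           [(X, Y[:, j]) for j in range(Y.shape[1])])
    x_new = np.array([[0.2, -0.4, 0.7]])
    print([net.predict(x_new)[0] for net in subnets])
```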
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wackerbarth, David
Sandia National Laboratories has developed a computer program to review, reduce, and manipulate waveform data. PlotData is designed for post-acquisition waveform data analysis and serves as an advanced interactive data analysis environment. PlotData requires unidirectional waveform data with both uniform and discrete time-series measurements. PlotData operates on a National Instruments' LabVIEW™ software platform. Using PlotData, the user can capture waveform data from digitizing oscilloscopes over a GPIB, USB, or Ethernet interface from Tektronix, LeCroy, or Agilent scopes. PlotData can both import and export several types of binary waveform files including, but not limited to, Tektronix .wmf files, LeCroy .trc files, and x-y pair ASCII files. Waveform manipulation includes numerous math functions, integration, differentiation, smoothing, truncation, and other specialized data reduction routines such as VISAR, POV, PVDF (Bauer) piezoelectric gauges, and piezoresistive gauges such as carbon manganin pressure gauges.
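PlotData itself is a LabVIEW application; the sketch below only illustrates the generic waveform operations named above (smoothing, differentiation, running integration) on a synthetic uniform time series, using NumPy/SciPy:

```python
import numpy as np
from scipy.signal import savgol_filter

# Hypothetical digitizer record: a 1 MS/s capture of a damped sine burst.
dt = 1e-6
t = np.arange(0, 5e-3, dt)
v = np.exp(-t / 1e-3) * np.sin(2 * np.pi * 5e3 * t)

smoothed = savgol_filter(v, window_length=51, polyorder=3)  # smoothing
derivative = np.gradient(smoothed, dt)                      # differentiation
integral = np.cumsum(smoothed) * dt                         # running integral

print(integral[-1], derivative.max())
```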
NASA Technical Reports Server (NTRS)
Lindstrom, David J.; Lindstrom, Richard M.
1989-01-01
Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.
NASA Astrophysics Data System (ADS)
Bochon, Krzysztof; Chmielniak, Tadeusz
2015-03-01
In this study, a detailed energy and economic analysis of a carbon capture installation was carried out. Chemical absorption with the use of monoethanolamine (MEA) and ammonia was adopted as the technology for carbon dioxide (CO2) capture from flue gases. The energy analysis was performed using a commercial software package for the analysis of chemical processes. In the case of MEA, the demand for regeneration heat was about 3.5 MJ/kg of CO2, whereas for ammonia it totalled 2 MJ/kg of CO2. The economic analysis was based on the net present value (NPV) method. The limit price for CO2 emissions allowances at which the investment project becomes profitable (NPV = 0) was more than 160 PLN/Mg for MEA and less than 150 PLN/Mg for ammonia. A sensitivity analysis was also carried out to determine the limit price of CO2 emissions allowances depending on electricity generation costs at different values of investment expenditures.
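A minimal sketch of the NPV-based limit-price calculation described above: bisect on the allowance price until NPV = 0. All cash-flow figures, the discount rate, and the plant life are hypothetical, not the study's inputs:

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** i for i, cf in enumerate(cashflows))

def breakeven_price(capex, capture_mg_per_year, extra_cost_per_year,
                    years=25, rate=0.06):
    """Bisection on the CO2 allowance price at which NPV = 0.

    Avoided-emission allowances are the only revenue credited to the
    capture installation in this toy model; all figures are hypothetical.
    """
    def npv_at(price):
        annual = price * capture_mg_per_year - extra_cost_per_year
        return npv(rate, [-capex] + [annual] * years)
    lo, hi = 0.0, 1000.0
    for _ in range(60):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if npv_at(mid) < 0 else (lo, mid)
    return (lo + hi) / 2

# Hypothetical plant: 2.5e9 PLN capture capex, 2e6 Mg CO2/yr captured,
# 1.2e8 PLN/yr in extra operating and energy-penalty costs.
print(round(breakeven_price(2.5e9, 2e6, 1.2e8), 1), "PLN/Mg")
```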
ERIC Educational Resources Information Center
Wiediger, Susan D.
2009-01-01
The periodic table and the periodic system are central to chemistry and thus to many introductory chemistry courses. A number of existing activities use various data sets to model the development process for the periodic table. This paper describes an image arrangement computer program developed to mimic a paper-based card sorting periodic table…
ERIC Educational Resources Information Center
Casquejo Johnston, Luz Marie
2016-01-01
This study examined the influence of enrollment in a Montessori adolescent program on the development of self-determination. The study focused on seventh-grade students. Student feelings of self-determination were recorded through three cycles of interviews throughout the year to capture the change, if any, in feelings of self-determination.…
Long-term soil productivity: genesis of the concept and principles behind the program
Robert F. Powers
2006-01-01
The capacity of a forest site to capture carbon and convert it into biomass defines fundamental site productivity. In the United States, the National Forest Management Act (NFMA) of 1976 mandates that this capacity must be protected on federally managed lands. Responding to NFMA, the USDA Forest Service began a soil-based monitoring program for its managed forests....
Developing an Internet-based Survey to Collect Program Cost Data
ERIC Educational Resources Information Center
Caffray, Christine M.; Chatterji, Pinka
2009-01-01
This manuscript describes the development and testing of an Internet-based cost survey that was designed by the authors for the National Assembly on School-Based Health Care (NASBHC) to capture the costs of school-based health programs. The intent of the survey was twofold. First, the survey was designed to collect comprehensive data on costs in a…
Writing a Brochure Is as Easy as 1-2-3. A Kit for Workshop and Program Planners.
ERIC Educational Resources Information Center
Witt, Ted
This kit is intended to help program planners write the information needed for an effective brochure advertising a workshop, seminar, conference, class, or academy. The kit contains the following sections: (1) Benefit Headlines Capture Reader Attention; (2) Establish a Need Quickly; (3) Identifying the Audience; (4) Making Top Names Tops; (5)…
Factors influencing the variation in capture rates of shrews in southern California, USA
Laakkonen, Juha; Fisher, Robert N.; Case, Ted J.
2003-01-01
We examined the temporal variation in capture rates of the shrews Notiosorex crawfordi (Coues, 1877) and Sorex ornatus (Merriam, 1895) in 20 sites representing fragmented and continuous habitats in southern California, USA. In N. crawfordi, the temporal variation was significantly correlated with the mean capture rates. Of the 6 landscape variables analyzed (size of the landscape, size of the sample area, altitude, edge, longitude and latitude), sample area was positively correlated with variation in capture rates of N. crawfordi. In S. ornatus, longitude was negatively correlated with variation in capture rates. Analysis of the effect of precipitation on the short- and long-term capture rates at 2 of the sites showed no correlation between rainfall and capture rates of shrews, even though peak numbers of shrews at both sites were reached during the year with the highest rainfall. A key problem confounding capture rates of shrews in southern California is the low overall abundance of both shrew species in all habitats and seasons.
Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.; Silva, Claudio
2013-09-30
For the past three years, a large analysis and visualization effort—funded by the Department of Energy’s Office of Biological and Environmental Research (BER), the National Aeronautics and Space Administration (NASA), and the National Oceanic and Atmospheric Administration (NOAA)—has brought together a wide variety of industry-standard scientific computing libraries and applications to create Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) to serve the global climate simulation and observational research communities. To support interactive analysis and visualization, all components connect through a provenance application programming interface to capture meaningful history and workflow. Components can be loosely coupled into the framework for fast integration or tightly coupled for greater system functionality and communication with other components. The overarching goal of UV-CDAT is to provide a new paradigm for access to and analysis of massive, distributed scientific data collections by leveraging distributed data architectures located throughout the world. The UV-CDAT framework addresses challenges in analysis and visualization and incorporates new opportunities, including parallelism for better efficiency, higher speed, and more accurate scientific inferences. Today, it provides more than 600 users access to more analysis and visualization products than any other single source.
Capturing method for integral three-dimensional imaging using multiviewpoint robotic cameras
NASA Astrophysics Data System (ADS)
Ikeya, Kensuke; Arai, Jun; Mishina, Tomoyuki; Yamaguchi, Masahiro
2018-03-01
Integral three-dimensional (3-D) technology for next-generation 3-D television must be able to capture dynamic moving subjects with pan, tilt, and zoom camerawork as good as in current TV program production. We propose a capturing method for integral 3-D imaging using multiviewpoint robotic cameras. The cameras are controlled through a cooperative synchronous system composed of a master camera controlled by a camera operator and other reference cameras that are utilized for 3-D reconstruction. When the operator captures a subject using the master camera, the region reproduced by the integral 3-D display is regulated in real space according to the subject's position and view angle of the master camera. Using the cooperative control function, the reference cameras can capture images at the narrowest view angle that does not lose any part of the object region, thereby maximizing the resolution of the image. 3-D models are reconstructed by estimating the depth from complementary multiviewpoint images captured by robotic cameras arranged in a two-dimensional array. The model is converted into elemental images to generate the integral 3-D images. In experiments, we reconstructed integral 3-D images of karate players and confirmed that the proposed method satisfied the above requirements.
Systematic R-matrix analysis of the 13C(p,γ)14N capture reaction
NASA Astrophysics Data System (ADS)
Chakraborty, Suprita; deBoer, Richard; Mukherjee, Avijit; Roy, Subinit
2015-04-01
Background: The proton capture reaction 13C(p,γ)14N is an important reaction in the CNO cycle during hydrogen burning in stars with mass greater than the mass of the Sun. It also occurs in astrophysical sites such as red giant stars: the asymptotic giant branch (AGB) stars. The low energy astrophysical S factor of this reaction is dominated by a resonance state at an excitation energy of around 8.06 MeV (Jπ = 1−, T = 1) in 14N. The other significant contributions come from the low energy tail of the broad resonance with Jπ = 0−, T = 1 at an excitation of 8.78 MeV and the direct capture process. Purpose: Measurements of the low energy astrophysical S factor of the radiative capture reaction 13C(p,γ)14N reported extrapolated values of S(0) that differ by about 30%. Subsequent R-matrix analysis and potential model calculations also yielded significantly different values for S(0). The present work intends to look into the discrepancy through a detailed R-matrix analysis with emphasis on the associated uncertainties. Method: A systematic reanalysis of the available decay data following the capture to the Jπ = 1−, T = 1 resonance state of 14N around 8.06 MeV excitation has been performed within the framework of the R-matrix method. A simultaneous analysis of the 13C(p,p0) data, measured over a similar energy range, was carried out with the capture data. The data for the ground state decay of the broad resonance state (Jπ = 0−, T = 1) around 8.78 MeV excitation were included as well. The external capture model along with the background poles to simulate the internal capture contribution were used to estimate the direct capture contribution. The asymptotic normalization constants (ANCs) for all states were extracted from the capture data. The multichannel, multilevel R-matrix code azure2 was used for the calculation. Results: The values of the astrophysical S factor at zero relative energy, resulting from the present analysis, are found to be consistent within the error bars for the two sets of capture data used. However, it is found from the fits to the elastic scattering data that the position of the Jπ = 1−, T = 1 resonance state is uncertain by about 0.6 keV, preferring an excitation energy value of 8.062 MeV. Also the extracted ANC values for the states of 14N corroborate the values from the transfer reaction studies. The reaction rates from the present calculation are about 10–15% lower than the values of the NACRE II compilation but compare well with those from NACRE I. Conclusion: The precise energy of the Jπ = 1−, T = 1 resonance level around 8.06 MeV in 14N must be determined. Further measurements around and below 100 keV with precision are necessary to reduce the uncertainty in the S-factor value at zero relative energy.
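For context, the astrophysical S factor quoted above is related to the capture cross section by S(E) = σ(E) E exp(2πη), with η the Sommerfeld parameter. A minimal sketch for p + 13C, using the standard numerical form of 2πη and a made-up data point:

```python
import math

def s_factor(E_keV, sigma_barns, z1=1, z2=6, mu_amu=0.9354):
    """Convert a capture cross section to the astrophysical S factor.

    S(E) = sigma(E) * E * exp(2*pi*eta), using the standard numerical
    form 2*pi*eta = 31.29 * z1 * z2 * sqrt(mu/E) for E in keV (c.m.)
    and the reduced mass mu in amu; here mu is that of p + 13C.
    Returns S in keV*barn.
    """
    two_pi_eta = 31.29 * z1 * z2 * math.sqrt(mu_amu / E_keV)
    return sigma_barns * E_keV * math.exp(two_pi_eta)

# Hypothetical data point: sigma = 1 nanobarn at E_cm = 100 keV
print(s_factor(100.0, 1e-9), "keV b")
```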
50 CFR 216.93 - Tracking and verification program.
Code of Federal Regulations, 2013 CFR
2013-10-01
... canning company in the 50 states, Puerto Rico, or American Samoa receives a domestic or imported shipment... short tons to the fourth decimal, ocean area of capture (ETP, western Pacific, Indian, eastern and...
50 CFR 216.93 - Tracking and verification program.
Code of Federal Regulations, 2014 CFR
2014-10-01
... canning company in the 50 states, Puerto Rico, or American Samoa receives a domestic or imported shipment... short tons to the fourth decimal, ocean area of capture (ETP, western Pacific, Indian, eastern and...
WHC significant lessons learned 1993--1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bickford, J.C.
1997-12-12
A lesson learned, as defined in DOE-STD-7501-95, Development of DOE Lessons Learned Programs, is: A "good work practice" or innovative approach that is captured and shared to promote repeat applications, or an adverse work practice or experience that is captured and shared to avoid a recurrence. The key word in both parts of this definition is "shared". This document was published to share a wide variety of recent Hanford experiences with other DOE sites. It also provides a valuable tool to be used in new employee and continuing training programs at Hanford facilities and at other DOE locations. This manual is divided into sections to facilitate extracting appropriate subject material when developing training modules. Many of the bulletins could be categorized into more than one section, however, so examination of other related sections is encouraged.
MULTISHOCKED, THREE-DIMENSIONAL SUPERSONIC FLOWFIELDS WITH REAL GAS EFFECTS
NASA Technical Reports Server (NTRS)
Kutler, P.
1994-01-01
This program determines the supersonic flowfield surrounding three-dimensional wing-body configurations such as a delta wing. It was designed to provide the numerical computation of three-dimensional, inviscid flowfields of either perfect or real gases about supersonic or hypersonic airplanes. The governing equations in conservation-law form are solved by a finite difference method using a second-order noncentered algorithm between the body and the outermost shock wave, which is treated as a sharp discontinuity. The flowfield between the body and the outermost shock is treated in a shock-capturing fashion, so secondary internal shocks that form between these boundaries are captured automatically and formed correctly. The program operates in batch mode, is in CDC update format, has been implemented on the CDC 7600, and requires more than 140K (octal) word locations.
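A minimal illustration of shock capturing with a second-order noncentered (predictor-corrector) scheme on a conservation law, here the 1-D inviscid Burgers equation rather than the program's 3-D gas-dynamics equations; grid and time step are arbitrary:

```python
import numpy as np

# MacCormack predictor-corrector for the inviscid Burgers equation
# u_t + (u^2/2)_x = 0 in conservation-law form. A shock forms and is
# captured automatically, without being fitted as a boundary.
nx, nsteps, dt = 200, 150, 0.002
dx = 1.0 / nx
x = np.linspace(0, 1, nx)
u = np.where(x < 0.5, 1.0, 0.0)   # right-moving shock initial data

flux = lambda u: 0.5 * u ** 2
for _ in range(nsteps):
    f = flux(u)
    # predictor: forward difference
    us = u.copy()
    us[:-1] = u[:-1] - dt / dx * (f[1:] - f[:-1])
    fs = flux(us)
    # corrector: backward difference on the predicted state
    u[1:-1] = 0.5 * (u[1:-1] + us[1:-1] - dt / dx * (fs[1:-1] - fs[:-2]))

# The shock moves at speed (u_left + u_right)/2 = 0.5, so ~0.5 + 0.5*t.
print("shock is near x =", x[np.argmin(np.abs(u - 0.5))])
```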
Demonstration of Advanced CO2 Capture Process Improvements for Coal-Fired Flue Gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carroll, John
This document summarizes the activities of Cooperative Agreement DE-FE0026590, “Demonstration of Advanced CO2 Capture Process Improvements for Coal-Fired Flue Gas,” during the performance period of October 1, 2015 through May 31, 2017. This project was funded by the U.S. Department of Energy (DOE) National Energy Technology Laboratory (NETL). Southern Company Services, Inc. (SCS) was the prime contractor and co-funder of the project. Mitsubishi Heavy Industries America (MHIA) and AECOM were project team members. The overall project objective was to improve the costs, energy requirements, and performance of an existing amine-based CO2 capture process via improvements in three areas: 1. Reboiler design – The first objective of the program was to demonstrate performance of an integrated stripper/reboiler (termed Built-in Reboiler, or BIR) to reduce footprint, capital costs, and integration issues of the current technology. 2. Particulate management – The second objective was to carry out a Particulate Matter Management (PMM) test. This has the potential to reduce operating costs and capital costs due to the reduced or eliminated need for mechanical filtration. 3. Solvent – The third objective was to carry out a new solvent test plan (referred to as NSL) to demonstrate a new solvent (termed New Solvent A), which is expected to reduce regeneration steam. The bulk price is also expected to be lower than that of KS-1, the current solvent used in this process. NSL testing would include baseline testing, optimization, long-term testing, solvent reclamation testing, and final inspection. These improvements combine to form the Advanced Carbon Capture (ACC) technology. Much of this work will be applicable to generic solvent processes, especially with regard to improved reboiler design, and is focused on meeting or exceeding the DOE’s overall carbon capture performance goals of a 90% CO2 capture rate with 95% CO2 purity at a cost of $40/tonne of CO2 by 2025 and at a cost of electricity (COE) 30% less than baseline CO2 capture approaches by 2030. This project was divided into two phases. Phase 1 is the planning phase, and Phase 2 is the construction, operations, testing, and analysis phase. A down-select occurred after Phase 1. Phase 1 activities were carried out during this reporting period, and therefore, Phase 1 activities are solely considered in this report. The project was not selected for Phase 2 funding.
Detection of K-ras and p53 Mutations in Sputum Samples of Lung Cancer Patients Using Laser Capture Microdissection Microscope and Mutation Analysis
Keohavong, Phouthone; Gao, Wei-Min; Zheng, Kui-Cheng; Mady, Hussam; Lan, Qing; Melhem, Mona; Mumford, Judy
...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krutka, Holly; Sjostrom, Sharon
2011-07-31
Through a U.S. Department of Energy (DOE) National Energy Technology Laboratory (NETL) funded cooperative agreement DE-NT0005649, ADA Environmental Solutions (ADA) has begun evaluating the use of solid sorbents for CO2 capture. The project objective was to address the viability and accelerate development of a solid-based CO2 capture technology. To meet this objective, initial evaluations of sorbents and the process/equipment were completed. First the sorbents were evaluated using a temperature swing adsorption process at the laboratory scale in a fixed-bed apparatus. A slipstream reactor designed to treat flue gas produced by coal-fired generation of nominally 1 kWe was designed and constructed, which was used to evaluate the most promising materials on a more meaningful scale using actual flue gas. In a concurrent effort, commercial-scale processes and equipment options were also evaluated for their applicability to sorbent-based CO2 capture. A cost analysis was completed that can be used to direct future technology development efforts. ADA completed an extensive sorbent screening program funded primarily through this project, DOE NETL cooperative agreement DE-NT0005649, with support from the Electric Power Research Institute (EPRI) and other industry participants. Laboratory screening tests were completed on simulated and actual flue gas using an automated fixed-bed system. The following types and quantities of sorbents were evaluated: 87 supported amines, 31 carbon-based materials, 6 zeolites, 7 supported carbonates (evaluated under separate funding), and 10 hydrotalcites. Sorbent evaluations were conducted to characterize materials and down-select promising candidates for further testing at the slipstream scale. More than half of the materials evaluated during this program were supported amines. Based on the laboratory screening, four supported amine sorbents were selected for evaluation at the 1 kW scale at two different field sites. ADA designed and fabricated a slipstream pilot to allow an evaluation of the kinetic behavior of sorbents and provide some flexibility for the physical characteristics of the materials. The design incorporated a transport reactor for the adsorber (co-current reactor) and a fluidized bed in the regenerator. This combination achieved the sorbent characterization goals and provided an opportunity to evaluate whether the potential cost savings associated with a relatively simple process design could overcome the sacrifices inherent in a co-current separation process. The system was installed at two field sites during the project, Luminant’s Martin Lake Steam Electric Station and Xcel Energy’s Sherburne County Generating Station (Sherco). Although the system could not maintain continuous 90% CO2 removal with the sorbents evaluated under this program, it was useful to compare the CO2 removal properties of several different sorbents on actual flue gas. One of the supported amine materials, sorbent R, was evaluated at both Martin Lake and Sherco. The 1 kWe pilot was operated in continuous mode as well as batch mode. In continuous mode, the sorbent performance could not overcome the limitations of the co-current adsorber design. In batch mode, sorbent R was able to remove up to 90% CO2 for several cycles. Approximately 50% of the total removal occurred in the first three feet of the adsorption reactor, which was a transport reactor.
During continuous testing at Sherco, CO2 removal decreased to approximately 20% at steady state. The lack of continuous removal was due primarily to the combination of a co-current adsorption system with a fluidized bed for regeneration, a combination which did not provide an adequate driving force to maintain an acceptable working CO2 capacity. In addition, because sorbent R consisted of a polymeric amine coated on a silica substrate, it was believed that the 50% amine loading resulted in mass diffusion limitations related to the CO2 uptake rate. Three additional supported amine materials, sorbents AX, F, and BN, were selected for evaluation using the 1 kW pilot at Sherco. Sorbent AX was operated in batch mode and performed similarly to sorbent R (i.e., it could achieve up to 90% removal when given adequate regeneration time). Sorbent BN was not expected to be subject to the same mass diffusion limitations as experienced with sorbent R. When sorbent BN was used in continuous mode, the steady-state CO2 removal was approximately double that of sorbent R, which highlighted the importance of sorbents without kinetic limitations. Many different processes and equipment designs exist that may be applicable for post-combustion CO2 capture using solids in a temperature-swing system. A thorough technology survey was completed to identify the most promising options, which were grouped and evaluated based on the four main unit operations involved with sorbent-based capture: adsorption; heating and cooling, or heat transfer; conveying; and desorption. The review included collecting information from a wide variety of sources, including technology databases, published papers, advertisements, web searches, and vendor interviews. Working with power producers, scoring sheets were prepared and used to compare the different technology options. Although several technologies were interesting and promising, those that were selected for the final conceptual design were commercially available and performed multiple steps simultaneously. For the adsorption step, adsorption and conveying were both accomplished in a circulating fluidized bed. A rotary kiln was selected for desorption and cooling because it can simultaneously accomplish conveying and effective heat transfer. The final technology selection was used to complete preliminary cost assessments for a conceptual 500 MW CO2 capture process. The high-level cost analysis was completed to determine the key cost drivers. The conceptual sorbent-based capture options yielded significant energy penalty and cost savings versus an aqueous amine system. Specifically, the estimated levelized cost of electricity (LCOE) for the final concept design, without a CO2 laden/lean sorbent heat exchanger or any other integration, was over 30% lower than that of the MEA capture process. However, this cost savings was not enough to meet the DOE’s target of ≤35% increase in LCOE. In order to reach this target, the incremental LCOE due to CO2 capture can be no higher than 2.10 ¢/kWh above the LCOE of the non-capture equivalent power plant (6.0 ¢/kWh). Although results of the 1 kWe pilot evaluations suggest that the initial full-scale concept design must be revisited to address the technical targets, the cost assessment still provides a valuable high-level estimate of the potential costs of a solids-based system.
A sensitivity analysis was conducted to determine the cost drivers, and its results will be used to direct future technology development efforts. The overall project objective was to assess the viability and accelerate development of a solid-based post-combustion CO2 capture technology that can be retrofit to the existing fleet of coal-fired power plants. This objective was successfully completed during the project along with several specific budget period goals. Based on sorbent screening and a full-scale equipment evaluation, it was determined that the use of solid sorbents for post-combustion capture is promising and warrants continued development efforts. Specifically, the lower sensible heat could result in a significant reduction in the energy penalty versus solvent-based capture systems, if the sorbents can be paired with a process and equipment that take advantage of the beneficial sorbent properties. It was also determined that a design using a circulating fluidized bed adsorber with rotary kilns for heating during regeneration, cooling, and conveying highlighted the advantage of sorbents versus solvents. However, additional technology development and cost reductions will be required to meet the DOE’s final technology goal of 90% CO2 capture with ≤35% increase in the cost of electricity. The cost analysis identified specific targets for the capital and operating costs, which will be used as the targets for future technology development efforts.
NASA Astrophysics Data System (ADS)
Nurhayati, A.; Purnomo, A. H.
2018-03-01
This research was aimed at analyzing factors influencing capture fisheries losses, focusing on technical, social and economic aspects at Pelabuhan Ratu. A case study was undertaken, through a survey involving 40 respondents. These respondents represented groups of fishers, collectors, middlemen, processors and consumers. The questions delivered in the survey were adapted from the Exploratory Fish Loss Assessment Method (EFLAM). Based on this research, fish losses were detected in Pelabuhan Ratu, amounting to 4.25% at the fisher level and 5.12% in the subsequent supply chain, attributable to several factors. Among the technical factors, the most influential were handling of landed fish, fish sorting, fish size, fish shelf life and season. Among the economic factors, those with the most significant influence were fish price fluctuation and price level; among the social factors, the most significant was the revenue distribution system. The relevant policy implication of this research is the need for effective programs covering the development of cold chain and distribution facilities and infrastructure, and an improvement in the skills and knowledge of fish derivative product processors.
The Environmental Control and Life Support System (ECLSS) advanced automation project
NASA Technical Reports Server (NTRS)
Dewberry, Brandon S.; Carnes, Ray
1990-01-01
The objective of the environmental control and life support system (ECLSS) Advanced Automation Project is to influence the design of the initial and evolutionary Space Station Freedom Program (SSFP) ECLSS toward a man-made closed environment in which minimal flight and ground manpower is needed. Another objective is capturing ECLSS design and development knowledge for future missions. Our approach has been to (1) analyze the SSFP ECLSS, (2) envision as our goal a fully automated evolutionary environmental control system - an augmentation of the baseline, and (3) document the advanced software systems, hooks, and scars which will be necessary to achieve this goal. From this analysis, prototype software is being developed, and will be tested using air and water recovery simulations and hardware subsystems. In addition, the advanced software is being designed, developed, and tested using automation software management plan and lifecycle tools. Automated knowledge acquisition, engineering, verification and testing tools are being used to develop the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the knowledge-based system tool community, and ensure proper visibility of our efforts.
Numerical Propulsion System Simulation: An Overview
NASA Technical Reports Server (NTRS)
Lytle, John K.
2000-01-01
The cost of implementing new technology in aerospace propulsion systems is becoming prohibitively expensive and time consuming. One of the main contributors to the high cost and lengthy time is the need to perform many large-scale hardware tests and the inability to integrate all appropriate subsystems early in the design process. The NASA Glenn Research Center is developing the technologies required to enable simulations of full aerospace propulsion systems in sufficient detail to resolve critical design issues early in the design process before hardware is built. This concept, called the Numerical Propulsion System Simulation (NPSS), is focused on the integration of multiple disciplines such as aerodynamics, structures and heat transfer with computing and communication technologies to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to be a "numerical test cell" that enables full engine simulation overnight on cost-effective computing platforms. There are several key elements within NPSS that are required to achieve this capability: 1) clear data interfaces through the development and/or use of data exchange standards, 2) modular and flexible program construction through the use of object-oriented programming, 3) integrated multiple-fidelity analysis (zooming) techniques that capture the appropriate physics at the appropriate fidelity for the engine systems, 4) multidisciplinary coupling techniques and finally 5) high performance parallel and distributed computing. The current state of development in these five areas focuses on air-breathing gas turbine engines and is reported in this paper. However, many of the technologies are generic and can be readily applied to rocket-based systems and combined cycles currently being considered for low-cost access-to-space applications. Recent accomplishments include: (1) the development of an industry-standard engine cycle analysis program and plug 'n play architecture, called NPSS Version 1; (2) a full engine simulation that combines a 3D low-pressure subsystem with a 0D high-pressure core simulation, demonstrating the ability to integrate analyses at different levels of detail and to aerodynamically couple components, the fan/booster and low-pressure turbine, through a 3D computational fluid dynamics simulation; (3) simulation of all of the turbomachinery in a modern turbofan engine on a parallel computing platform for rapid and cost-effective execution, a capability that can also be used to generate a full compressor map, requiring both design and off-design simulation; (4) three levels of coupling characterizing the multidisciplinary analysis under NPSS: loosely coupled, process coupled and tightly coupled. The loosely coupled and process coupled approaches require a common geometry definition to link CAD to analysis tools. The tightly coupled approach is currently validating the use of an arbitrary Lagrangian/Eulerian formulation for rotating turbomachinery. The validation includes both centrifugal and axial compression systems. The results of the validation will be reported in the paper. (5) The demonstration of significant computing cost/performance reduction for turbine engine applications using PC clusters. The NPSS Project is supported under the NASA High Performance Computing and Communications Program.
Chung, Arlene; Battaglioli, Nicole; Lin, Michelle; Sherbino, Jonathan
2018-02-01
Physician well-being is garnering increasing attention. In 2016, the Journal of Graduate Medical Education (JGME) published a review by Kristin Raj, MD, entitled "Well-Being in Residency: A Systematic Review." There is benefit in contextualizing the literature on resident well-being through an academic journal club. We summarized an asynchronous, online journal club discussion about this systematic review and highlighted themes that were identified in the review. In January 2017, JGME and the Academic Life in Emergency Medicine (ALiEM) blog facilitated an open-access, online, weeklong journal club on the featured JGME article. Online discussions and interactions were facilitated via blog posts and comments, a video discussion on Google Hangouts on Air, and Twitter. We performed a thematic analysis of the discussion and captured web analytics. Over the first 14 days, the blog post was viewed 1070 unique times across 52 different countries. A total of 130 unique participants on Twitter posted 480 tweets using the hashtag #JGMEscholar. Thematic analysis revealed 5 major domains: the multidimensional nature of well-being, measurement of well-being, description of wellness programs and interventions, creation of a culture of wellness, and critique of the methodology of the review. Our online journal club highlighted several gaps in the current understanding of resident well-being, including the need for consensus on the operational definition, the need for effective instruments to evaluate wellness programs and identify residents in distress, and a national research collaboration to assess wellness programs and their impact on resident well-being.
Can the national surgical quality improvement program provide surgeon-specific outcomes?
Kuhnen, Angela H; Marcello, Peter W; Roberts, Patricia L; Read, Thomas E; Schoetz, David J; Rusin, Lawrence C; Hall, Jason F; Ricciardi, Rocco
2015-02-01
Efforts to improve the quality of surgical care and reduce morbidity and mortality have resulted in outcomes reporting at the service and institutional level. Surgeon-specific outcomes are not readily available. The aim of this study is to compare surgeon-specific outcomes from the National Surgical Quality Improvement Program and 100% capture institutional quality data. We conducted a cohort study evaluating institutional and surgeon-specific outcomes following colorectal surgery procedures at 1 institution over 5 years. All patients who underwent an operation by a colorectal surgeon at Lahey Hospital & Medical Center from January 1, 2008 through December 31, 2012 were identified. Thirty-day mortality, reoperation, urinary tract infection, deep vein thrombosis, pneumonia, superficial surgical site infection, and organ space infection were the primary outcomes measured. We compared annual and 5-year institutional and surgeon-specific adverse event rates between the data sets. In addition, we categorized individual surgeons as low-outlier, average, or high-outlier in relation to aggregate averages and determined the concordance between the data sets in identifying outliers. Concordance was designated if the 2 databases classified outlier status similarly for the same adverse event category. In the 100% capture institutional data, 6459 operative encounters were identified in comparison with 1786 National Surgical Quality Improvement Program encounters (28% sampled). Annual aggregate adverse event rates were similar between the institutional data and the National Surgical Quality Improvement Program. For annual surgeon-specific comparisons, concordance in identifying outliers between the 2 data sets was 51.4%, and gross discordance between outlier status was in 8.2%. Five-year surgeon-specific comparisons demonstrated 59% concordance in identifying outlier status with 8.2% gross discordance for the group. The inclusion of data from only 1 academic referral center is a limitation of this study. Each surgeon was identified as a "high outlier" in at least 1 adverse event category. Comparisons at the annual and 5-year points demonstrated poor concordance between our 100% capture institutional data and the National Surgical Quality Improvement Program data.
Douglas, Erik S; Hsiao, Sonny C; Onoe, Hiroaki; Bertozzi, Carolyn R; Francis, Matthew B; Mathies, Richard A
2009-07-21
A microdevice is developed for DNA-barcode directed capture of single cells on an array of pH-sensitive microelectrodes for metabolic analysis. Cells are modified with membrane-bound single-stranded DNA, and specific single-cell capture is directed by the complementary strand bound in the sensor area of the iridium oxide pH microelectrodes within a microfluidic channel. This bifunctional microelectrode array is demonstrated for the pH monitoring and differentiation of primary T cells and Jurkat T lymphoma cells. Single Jurkat cells exhibited an extracellular acidification rate of 11 milli-pH min(-1), while primary T cells exhibited only 2 milli-pH min(-1). This system can be used to capture non-adherent cells specifically and to discriminate between visually similar healthy and cancerous cells in a heterogeneous ensemble based on their altered metabolic properties.
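A minimal sketch of how an extracellular acidification rate in milli-pH per minute can be extracted from a microelectrode pH trace, assuming a simple linear fit; the trace is synthetic, seeded near the Jurkat value reported above:

```python
import numpy as np

def acidification_rate(t_seconds, ph):
    """Extracellular acidification rate in milli-pH per minute.

    Fits a line to the pH-vs-time trace from one microelectrode and
    returns the (negative of the) slope scaled to milli-pH/min.
    """
    slope = np.polyfit(t_seconds, ph, 1)[0]    # pH units per second
    return -slope * 60.0 * 1000.0              # milli-pH per minute

# Hypothetical 2-minute trace from a captured cell drifting acidic at
# roughly the 11 milli-pH/min reported for single Jurkat cells.
t = np.linspace(0, 120, 60)
ph = 7.40 - 11e-3 / 60.0 * t + np.random.default_rng(1).normal(0, 1e-3, t.size)
print(round(acidification_rate(t, ph), 1))
```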
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zollman, Richard L.; Eschler, Russell; Sealey, Shawn
2009-03-31
The Nez Perce Tribe (NPT), through funding provided by the Bonneville Power Administration (BPA), has implemented a Chinook salmon supplementation program (250,000 smolts) on the Lostine River, a tributary to the Grande Ronde River of Oregon. The Grande Ronde Endemic Spring Chinook Salmon Supplementation project, which involves supplementation of the Upper Grande Ronde River and Catherine Creek in addition to the Lostine River, was established to prevent extirpation and increase the number of threatened Snake River spring/summer Chinook salmon (Oncorhynchus tshawytscha) returning to the Grande Ronde River. This report covers the seventh season (1997-2003) of adult Chinook salmon broodstock collection in the Lostine River and the fifth season (1999-2003) of acclimating the resultant progeny. Production of Lostine River spring Chinook salmon smolts currently occurs at Lookingglass Fish Hatchery (LGH). The Lostine River supplementation program utilizes two strategies to obtain egg source for production of smolts for supplementation: captive broodstock and conventional broodstock. The captive broodstock strategy involves (1) capture of natural juvenile spring Chinook salmon smolts from the Lostine River, (2) rearing those to adult and spawning them, and (3) rearing the resultant progeny for eventual acclimation and release back into the Lostine River. The conventional broodstock strategy involves (1) capture of natural and hatchery origin adults returning to the Lostine River, (2) holding those adults and spawning them, and (3) rearing the resultant progeny for acclimation and release back into the Lostine River. This report focuses on (1) the trapping and collection of adult spring Chinook salmon that return to the Lostine River, which provides the broodstock source for the conventional strategy and (2) the acclimation and release of juvenile spring Chinook salmon produced from the captive broodstock and conventional broodstock strategies. In 2003, acclimation of Lostine River spring Chinook salmon smolts occurred from March 3, 2003 through to April 14, 2003 and a total of 242,776 smolts were acclimated and released. These smolts were produced from the brood year (BY) 2001 egg source and included captive broodstock (141,860) and conventional broodstock (100,916) origin smolts that were all progeny of Lostine River spring Chinook salmon. Operation of the Lostine River adult monitoring and collection facility in 2003 began April 30th, the first Chinook was captured on May 16, 2003 and the last Chinook was captured on September 21, 2003. The weir and trap were removed on October 1, 2003. A total of 464 adult Chinook, including jacks, were captured during the season. The composition of the run included 239 natural origin fish and 225 hatchery supplementation fish. There were no identified 'stray' hatchery fish from other programs trapped. Of the fish captured, 45 natural and 4 hatchery supplementation adults were retained for broodstock and transported to LGH for holding and spawning, 366 adult Chinook were passed or transported above the weir to spawn naturally, and 49 hatchery origin adult jack Chinook were transported and outplanted in the Wallowa River and Bear Creek to spawn in underseeded habitat. Of the 49 adults retained for broodstock at Lookingglass Hatchery, 21 natural females and no hatchery origin females were represented in spawning. These females produced a total of 106,609 eggs at fertilization.
Eye-up was 95.50%, which yielded a total of 101,811 conventional program eyed eggs. Fecundity averaged 5,077 eggs per female. These eggs were incubated at Lookingglass Hatchery until the eyed stage. At the eyed stage they were transferred to Oxbow Hatchery, where they were reared to the fingerling stage, at which time they were transported back to LGH and held until they were smolts in the spring of 2005. Captive brood program eggs/fish will be added to the conventional program eggs to make up the entire juvenile release for the Lostine River program in 2005.
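For orientation, the reported egg figures are internally consistent (a check added here, not part of the original report):

\[
\frac{106{,}609\ \text{eggs}}{21\ \text{females}} \approx 5{,}077\ \text{eggs per female},
\qquad
106{,}609 \times 0.9550 \approx 101{,}811\ \text{eyed eggs}.
\]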
Yates, Kenneth; Sullivan, Maura; Clark, Richard
2012-01-01
Cognitive task analysis (CTA) methods were used for 2 surgical procedures to determine (1) the extent to which experts omitted critical information, (2) the number of experts required to capture the optimal amount of information, and (3) the effectiveness of a CTA-informed curriculum. Six expert physicians were interviewed for each of the two procedures, central venous catheter placement and open cricothyrotomy. The transcripts were coded, corrected, and aggregated as a "gold standard." The information captured for each surgeon was then analyzed against the gold standard. Experts omitted an average of 34% of the decisions for the central venous catheter placement and 77% of the decisions for the open cricothyrotomy. Three to 4 experts were required to capture the optimal amount of information. Significant positive effects on performance (t(21) = 2.08, P = .050) and self-efficacy ratings (t(18) = 2.38, P = .029) were found for the CTA-informed curriculum for cricothyrotomy. CTA is an effective method to capture expertise in surgery and a valuable component to improve surgical training. Copyright © 2012 Elsevier Inc. All rights reserved.
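For orientation, the omission and expert-accumulation analyses reduce to simple set arithmetic against the aggregated gold standard. A minimal sketch with hypothetical decision steps and expert transcripts (not the authors' actual coding scheme):

    # Hypothetical sketch: the gold standard is a set of coded decision steps;
    # each expert transcript is the subset of steps that expert mentioned.
    from itertools import combinations

    gold = {"sterile_prep", "landmark_id", "needle_angle",
            "confirm_placement", "secure_line"}
    experts = {
        "A": {"sterile_prep", "needle_angle", "secure_line"},
        "B": {"landmark_id", "needle_angle", "confirm_placement"},
        "C": {"sterile_prep", "landmark_id", "confirm_placement", "secure_line"},
    }

    # Percentage of gold-standard decisions each expert omitted
    for name, steps in experts.items():
        omitted = 1 - len(steps & gold) / len(gold)
        print(f"Expert {name}: {omitted:.0%} omitted")

    # Smallest number of experts whose pooled transcripts cover the gold standard
    for k in range(1, len(experts) + 1):
        if any(set.union(*(experts[n] for n in combo)) >= gold
               for combo in combinations(experts, k)):
            print(f"{k} experts suffice for full coverage")
            break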
Capture mechanism in Palaeotropical pitcher plants (Nepenthaceae) is constrained by climate
Moran, Jonathan A.; Gray, Laura K.; Clarke, Charles; Chin, Lijin
2013-01-01
Background and Aims Nepenthes (Nepenthaceae, approx. 120 species) are carnivorous pitcher plants with a centre of diversity comprising the Philippines, Borneo, Sumatra and Sulawesi. Nepenthes pitchers use three main mechanisms for capturing prey: epicuticular waxes inside the pitcher; a wettable peristome (a collar-shaped structure around the opening); and viscoelastic fluid. Previous studies have provided evidence suggesting that the first mechanism may be more suited to seasonal climates, whereas the latter two might be more suited to perhumid environments. In this study, this idea was tested using climate envelope modelling. Methods A total of 94 species, comprising 1978 populations, were grouped by prey capture mechanism (large peristome, small peristome, waxy, waxless, viscoelastic, non-viscoelastic, ‘wet’ syndrome and ‘dry’ syndrome). Nineteen bioclimatic variables were used to model habitat suitability at approx. 1 km resolution for each group, using Maxent, a presence-only species distribution modelling program. Key Results Prey capture groups putatively associated with perhumid conditions (large peristome, waxless, viscoelastic and ‘wet’ syndrome) had more restricted areas of probable habitat suitability than those putatively associated with less humid conditions (small peristome, waxy, non-viscoelastic and ‘dry’ syndrome). Overall, the viscoelastic group showed the most restricted area of modelled suitable habitat. Conclusions The current study is the first to demonstrate that the prey capture mechanism in a carnivorous plant is constrained by climate. Nepenthes species employing peristome-based and viscoelastic fluid-based capture are largely restricted to perhumid regions; in contrast, the wax-based mechanism allows successful capture in both perhumid and more seasonal areas. Possible reasons for the maintenance of peristome-based and viscoelastic fluid-based capture mechanisms in Nepenthes are discussed in relation to the costs and benefits associated with a given prey capture strategy. PMID: 23975653
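For orientation, the modelling step can be approximated outside the Maxent program with any presence/background classifier over bioclimatic predictors. The sketch below is an illustrative surrogate (logistic regression on synthetic data), not the study's Maxent workflow:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    # Hypothetical bioclim predictors, e.g. annual mean temperature (deg C)
    # and precipitation seasonality; presences cluster, background is uniform.
    X_presence = rng.normal([25.0, 20.0], [1.5, 5.0], size=(200, 2))
    X_background = rng.uniform([10.0, 0.0], [35.0, 120.0], size=(2000, 2))

    X = np.vstack([X_presence, X_background])
    y = np.concatenate([np.ones(200), np.zeros(2000)])

    model = LogisticRegression().fit(X, y)
    # Relative habitat suitability for one candidate ~1 km grid cell
    print(model.predict_proba([[24.0, 18.0]])[0, 1])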
Transient flow analysis linked to fast pressure disturbance monitored in pipe systems
NASA Astrophysics Data System (ADS)
Kueny, J. L.; Lourenco, M.; Ballester, J. L.
2012-11-01
EDF Hydro Division has launched the RENOUVEAU program in order to increase performance and improve plant availability through anticipation. Under this program, a large penstock fleet has been equipped with pressure transducers linked to a dedicated monitoring system. Any significant pressure disturbance is captured in a snapshot, and the waveform of the signal is stored and analyzed. During these transient states, variations in flow are unknown. In order to determine the structural impact of the overpressures occurring during complex transient conditions over the entire circuit, EDF DTG asked ENSE3 GRENOBLE to develop a code called ACHYL CF. The input data of ACHYL CF are the circuit topology and the pressure boundary conditions. This article provides a description of the computer code developed for modeling transient flow in a pipe network using the signals from pressure transducers as boundary conditions. Several test cases are presented, simulating real hydro power plants for which measured pressure signals are available.
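The governing equations are not reproduced in the abstract; transient pipe-flow codes of this kind conventionally solve the one-dimensional water hammer equations (a standard formulation, assumed here for orientation):

\[
\frac{\partial H}{\partial t} + \frac{a^2}{gA}\frac{\partial Q}{\partial x} = 0,
\qquad
\frac{\partial Q}{\partial t} + gA\frac{\partial H}{\partial x} + \frac{f\,Q\,|Q|}{2DA} = 0,
\]

where H is the piezometric head, Q the discharge, a the pressure wave speed, A the pipe cross-sectional area, D the diameter and f the Darcy-Weisbach friction factor; the measured pressure signals enter as boundary conditions on H.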
CAS2D: FORTRAN program for nonrotating blade-to-blade, steady, potential transonic cascade flows
NASA Technical Reports Server (NTRS)
Dulikravich, D. S.
1980-01-01
An exact, full-potential-equation (FPE) model for the steady, irrotational, homentropic and homoenergetic flow of a compressible, homocompositional, inviscid fluid through two-dimensional planar cascades of airfoils was derived, together with its appropriate boundary conditions. A computer program, CAS2D, was developed that numerically solves an artificially time-dependent form of the actual FPE. The governing equation was discretized by using type-dependent, rotated finite differencing and the finite area technique. The flow field was discretized by providing a boundary-fitted, nonuniform computational mesh. The mesh was generated by using a sequence of conformal mapping, nonorthogonal coordinate stretching, and local, isoparametric, bilinear mapping functions. The discretized form of the FPE was solved iteratively by using successive line overrelaxation. Possible isentropic shocks are correctly captured by explicitly adding an artificial viscosity in conservative form. In addition, a three-level consecutive mesh-refinement feature makes CAS2D a reliable and fast algorithm for the analysis of transonic, two-dimensional cascade flows.
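For reference, the steady FPE in conservative form for the velocity potential Φ reads (a standard statement consistent with the abstract, not quoted from the report):

\[
\frac{\partial}{\partial x}\left(\rho\,\Phi_x\right) + \frac{\partial}{\partial y}\left(\rho\,\Phi_y\right) = 0,
\qquad
\rho = \left[1 - \frac{\gamma-1}{2}\left(\Phi_x^2 + \Phi_y^2\right)\right]^{1/(\gamma-1)},
\]

with velocities nondimensionalized by the stagnation speed of sound; the artificial viscosity enters through an upwind-biased density evaluated in the streamwise direction.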
Mapreduce is Good Enough? If All You Have is a Hammer, Throw Away Everything That's Not a Nail!
Lin, Jimmy
2013-03-01
Hadoop is currently the large-scale data analysis "hammer" of choice, but there exist classes of algorithms that aren't "nails" in the sense that they are not particularly amenable to the MapReduce programming model. To address this, researchers have proposed MapReduce extensions or alternative programming models in which these algorithms can be elegantly expressed. This article espouses a very different position: that MapReduce is "good enough," and that instead of trying to invent screwdrivers, we should simply get rid of everything that's not a nail. To be more specific, much discussion in the literature surrounds the fact that iterative algorithms are a poor fit for MapReduce. The simple solution is to find alternative, noniterative algorithms that solve the same problem. This article captures my personal experiences as an academic researcher as well as a software engineer in a "real-world" production analytics environment. From this combined perspective, I reflect on the current state and future of "big data" research.
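For readers unfamiliar with the model, a MapReduce job is just a mapper emitting key-value pairs and a reducer aggregating values per key; the article's position is that problems should be recast so one or a few such passes suffice. A generic plain-Python sketch (simulating the shuffle locally, not code from the article):

    from collections import defaultdict

    def mapper(doc):
        for word in doc.split():
            yield word, 1          # emit (key, value) pairs

    def reducer(word, counts):
        return word, sum(counts)   # aggregate all values for one key

    docs = ["big data is big", "data is not a nail"]
    shuffle = defaultdict(list)    # stand-in for the framework's shuffle phase
    for doc in docs:
        for key, value in mapper(doc):
            shuffle[key].append(value)

    print(sorted(reducer(k, v) for k, v in shuffle.items()))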
Enhancing community based health programs in Iran: a multi-objective location-allocation model.
Khodaparasti, S; Maleki, H R; Jahedi, S; Bruni, M E; Beraldi, P
2017-12-01
Community Based Organizations (CBOs) are important health system stakeholders with the mission of addressing the social and economic needs of individuals and groups in a defined geographic area, usually no larger than a county. The access and success efforts of CBOs vary, depending on the integration between health care providers and CBOs but also in relation to the community participation level. To achieve widespread results, it is important to carefully design an efficient network which can serve as a bridge between the community and the health care system. This study addresses this challenge through a location-allocation model that deals with the hierarchical nature of the system explicitly. To reflect social welfare concerns of equity, local accessibility, and efficiency, we develop the model in a multi-objective framework, capturing the ambiguity in the decision makers' aspiration levels through a fuzzy goal programming approach. This study reports the findings for the real case of Shiraz city, Fars province, Iran, obtained by a thorough analysis of the results.
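For orientation, fuzzy goal programming approaches of this kind are often written in a max-min form (a generic formulation, assumed here; the paper's exact model may differ):

\[
\max\ \lambda
\quad\text{s.t.}\quad
\mu_k\!\left(g_k(x)\right) \ge \lambda,\ \ k = 1,\dots,K,
\qquad x \in X,\ \ \lambda \in [0,1],
\]

where each membership function \(\mu_k\) encodes the decision maker's ambiguous aspiration level for objective \(g_k\) (equity, local accessibility, efficiency) and X is the set of feasible location-allocation decisions.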
Robust object tracking techniques for vision-based 3D motion analysis applications
NASA Astrophysics Data System (ADS)
Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.
2016-04-01
Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications, including industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the acquired data, together with convenience for the user, are the main characteristics defining the quality of a motion capture system. Among existing systems for 3D data acquisition based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems offer a set of advantages such as high acquisition speed and potential for high accuracy and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capture process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes two to four machine-vision cameras for capturing video sequences of object motion. Original camera calibration and external orientation procedures provide the basis for high accuracy of 3D measurements. A set of algorithms was developed and tested, both for detecting, identifying, and tracking similar targets and for marker-less object motion capture. Evaluation results show high robustness and reliability for various motion analysis tasks in technical and biomechanics applications.
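The core photogrammetric step in any such system is triangulating a tracked feature from two or more calibrated, synchronized views. A minimal linear (DLT) triangulation sketch (illustrative, not Mosca code):

    import numpy as np

    def triangulate(P1, P2, x1, x2):
        """P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) image coords."""
        A = np.array([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, Vt = np.linalg.svd(A)      # least-squares solution of A X = 0
        X = Vt[-1]
        return X[:3] / X[3]              # homogeneous -> Euclidean 3D point

    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])            # reference camera
    P2 = np.hstack([np.eye(3), [[-0.2], [0.0], [0.0]]])      # camera offset 0.2 m
    print(triangulate(P1, P2, (0.10, 0.05), (0.06, 0.05)))   # -> [0.5, 0.25, 5.0]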
Kim, Kyung Seok; Sappington, Thomas W; Allen, Charles T
2008-12-01
Thirty-seven boll weevils, Anthonomus grandis grandis Boheman (Coleoptera: Curculionidae), were captured in pheromone traps near Lubbock, TX, in the Southern High Plains/Caprock eradication zone during August-October 2006. No boll weevils had been captured in this zone or neighboring zones to the north earlier in the year, and only very low numbers had been captured in neighboring zones to the south and east. Therefore, the captures near Lubbock were unexpected. Five of the weevils captured the last week of August were preserved and genotyped at 10 microsatellite loci for comparison with a database of genotypes for 22 boll weevil populations sampled from eight U.S. states and four locations in Mexico. The Lubbock population itself is an unlikely source, suggesting that the captured weevils probably did not originate from a low-level endemic population. Populations from eastern states, Mexico, and Big Spring, TX, can be confidently excluded as potential source regions. Although the Weslaco and Kingsville, TX, areas cannot be statistically excluded, they are unlikely sources. The most likely sources are nearby areas of New Mexico, Texas, or southwest Oklahoma, or areas of eastern Texas represented by the Waxahachie and El Campo populations. Together, genetic and circumstantial evidence suggest either that the trapped boll weevils are the offspring of a lone mated female that immigrated from eastern Texas earlier in the summer or that weevils originally captured near Waxahachie but now long-dead were planted in the traps by a disgruntled employee of the eradication program.
Thibault, Bernard; Roy, Denis; Guerra, Peter G; Macle, Laurent; Dubuc, Marc; Gagné, Pierre; Greiss, Isabelle; Novak, Paul; Furlani, Aldo; Talajic, Mario
2005-07-01
Cardiac resynchronization therapy (CRT) has been shown to improve symptoms of patients with moderate to severe heart failure. Optimal CRT involves biventricular or left ventricular (LV) stimulation alone, atrio-ventricular (AV) delay optimization, and possibly interventricular timing adjustment. Recently, anodal capture of the right ventricle (RV) has been described for patients with CRT-pacemakers. It is unknown whether the same phenomenon exists in CRT systems associated with defibrillators (CRT-ICD). The RV leads used in these systems are different from pacemaker leads: they have a larger diameter and shocking coils, which may affect the occurrence of anodal capture. We looked for anodal RV capture during LV stimulation in 11 consecutive patients who received a CRT-ICD system with RV leads with a true bipolar design. Fifteen patients who had RV leads with an integrated design were used as controls. Anodal RV and LV thresholds were determined at pulse width (pw) durations of 0.2, 0.5, and 1.0 ms. RV anodal capture during LV pacing was found in 11/11 patients at some output with true bipolar RV leads versus 0/15 patients with RV leads with an integrated bipolar design. Anodal RV capture threshold was more affected by changes in pw duration than LV capture threshold. In CRT-ICD systems, RV leads with a true bipolar design with the proximal ring also used as the anode for LV pacing are associated with a high incidence of anodal RV capture during LV pacing. This may affect the clinical response to alternative resynchronization methods using single LV stimulation or interventricular delay programming.
Saito, Mika; Nakata, Katsushi; Nishijima, Taku; Yamashita, Katsuhiro; Saito, Anna; Ogura, Go
2009-06-01
A project to eradicate invasive small Asian mongooses (Herpestes javanicus) is underway to conserve the unique ecosystem of Okinawa Island, Japan. In the present study, we tried to elucidate whether the mongoose is a host of Japanese encephalitis virus (JEV) and to evaluate the reliability of surveillance of Japanese encephalitis (JE) using this species. Culex tritaeniorhynchus, the main vector mosquito of JEV, feeds on the mongoose. Eighty-five (35.4%) of 240 wild small Asian mongooses captured between 2001 and 2005 had neutralizing antibodies against more than one of four JEV strains. Prevalence rates of JEV antibodies tended to increase with body weight and length of the animals. One of three sentinel mongooses showed a temporal change in antibody titer. These results indicate that the small Asian mongooses on Okinawa Island are sensitive to JEV. From the antibody titers and the locations of capture, the JEV active area was clarified. We propose that surveillance of JE using mongooses captured under the eradication program is reliable.
First Light with the NRAO Transient Event Capture Hardware
NASA Astrophysics Data System (ADS)
Langston, Glen; Rumberg, B.; Brandt, P.
2007-12-01
The design, implementation, and testing of the first NRAO Event Capture data acquisition system is presented. NRAO in Green Bank is developing a set of new data acquisition systems based on the U.C. Berkeley CASPER IBOB/ADC/BEE2 hardware. We describe the hardware configuration and initial experiences with the development system. We present the first astronomical tests of the Event Capture system, using the 43m (140ft) telescope. The observations were carried out at 900 MHz on 2007 July 8 and 9 towards the Crab pulsar, the galactic center, and the Moon, plus two test observations with the 43m pointed at zenith (straight up). Event Capture is one of several ongoing FPGA-based data acquisition projects being implemented for the Robert C. Byrd Green Bank Telescope (GBT) and for the 43m telescope. The NRAO Configurable Instrument Collaboration for Agile Data Acquisition (CICADA) program is described at: http://wikio.nrao.edu/bin/view/CICADA
Rockers, Peter C; Tugwell, Peter; Røttingen, John-Arne; Bärnighausen, Till
2017-09-01
Although the number of quasi-experiments conducted by health researchers has increased in recent years, there clearly remains unrealized potential for using these methods for causal evaluation of health policies and programs globally. This article proposes five prescriptions for capturing the full value of quasi-experiments for health research. First, new funding opportunities targeting proposals that use quasi-experimental methods should be made available to a broad pool of health researchers. Second, administrative data from health programs, often amenable to quasi-experimental analysis, should be made more accessible to researchers. Third, training in quasi-experimental methods should be integrated into existing health science graduate programs to increase global capacity to use these methods. Fourth, clear guidelines for primary research and synthesis of evidence from quasi-experiments should be developed. Fifth, strategic investments should be made to continue to develop new innovations in quasi-experimental methodologies. Tremendous opportunities exist to expand the use of quasi-experimental methods to increase our understanding of which health programs and policies work and which do not. Health researchers should continue to expand their commitment to rigorous causal evaluation with quasi-experimental methods, and international institutions should increase their support for these efforts. Copyright © 2017 Elsevier Inc. All rights reserved.
Mbaeyi, Chukwuma; Kamawal, Noor Shah; Porter, Kimberly A; Azizi, Adam Khan; Sadaat, Iftekhar; Hadler, Stephen; Ehrhardt, Derek
2017-07-01
The Basic Package of Health Services (BPHS) program has increased access to immunization services for children living in rural Afghanistan. However, multiple surveys have indicated persistent immunization coverage gaps. Hence, to identify gaps in implementation, an assessment of the BPHS program was undertaken, with specific focus on the routine immunization (RI) component. A cross-sectional survey was conducted in 2014 on a representative sample drawn from a sampling frame of 1858 BPHS health facilities. Basic descriptive analysis was performed, capturing general characteristics of survey respondents and assessing specific RI components, and χ2 tests were used to evaluate possible differences in service delivery by type of health facility. Of 447 survey respondents, 27% were health subcenters (HSCs), 30% were basic health centers, 32% were comprehensive health centers, and 12% were district hospitals. Eighty-seven percent of all respondents offered RI services, though only 61% of HSCs did so. Compared with other facility types, HSCs were less likely to have adequate stock of vaccines, essential cold-chain equipment, or proper documentation of vaccination activities. There is an urgent need to address manpower and infrastructural deficits in RI service delivery through the BPHS program, especially at the HSC level. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
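The facility-type comparison described can be reproduced with a standard chi-square test on a contingency table. An illustrative sketch with made-up counts loosely matching the reported percentages (not the survey data):

    from scipy.stats import chi2_contingency

    # Rows: offers RI services (yes, no); columns: HSC, BHC, CHC, DH
    table = [[74, 120, 130, 50],
             [47,  14,  13,  4]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4g}")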
Dielectrophoretic Capture and Genetic Analysis of Single Neuroblastoma Tumor Cells
Carpenter, Erica L.; Rader, JulieAnn; Ruden, Jacob; Rappaport, Eric F.; Hunter, Kristen N.; Hallberg, Paul L.; Krytska, Kate; O’Dwyer, Peter J.; Mosse, Yael P.
2014-01-01
Our understanding of the diversity of cells that escape the primary tumor and seed micrometastases remains rudimentary, and approaches for studying circulating and disseminated tumor cells have been limited by low throughput and sensitivity, reliance on single parameter sorting, and a focus on enumeration rather than phenotypic and genetic characterization. Here, we utilize a highly sensitive microfluidic and dielectrophoretic approach for the isolation and genetic analysis of individual tumor cells. We employed fluorescence labeling to isolate 208 single cells from spiking experiments conducted with 11 cell lines, including 8 neuroblastoma cell lines, and achieved a capture sensitivity of 1 tumor cell per 10⁶ white blood cells (WBCs). Sample fixation or freezing had no detectable effect on cell capture. Point mutations were accurately detected in the whole genome amplification product of captured single tumor cells but not in negative control WBCs. We applied this approach to capture 144 single tumor cells from 10 bone marrow samples of patients suffering from neuroblastoma. In this pediatric malignancy, high-risk patients often exhibit widespread hematogenous metastasis, but access to primary tumor can be difficult or impossible. Here, we used flow-based sorting to pre-enrich samples with tumor involvement below 0.02%. For all patients for whom a mutation in the Anaplastic Lymphoma Kinase gene had already been detected in their primary tumor, the same mutation was detected in single cells from their marrow. These findings demonstrate a novel, non-invasive, and adaptable method for the capture and genetic analysis of single tumor cells from cancer patients. PMID: 25133137
Jit, Mark; Levin, Carol; Brisson, Marc; Levin, Ann; Resch, Stephen; Berkhof, Johannes; Kim, Jane; Hutubessy, Raymond
2013-01-30
Low- and middle-income countries need to consider economic issues such as cost-effectiveness, affordability and sustainability before introducing a program for human papillomavirus (HPV) vaccination. However, many such countries lack the technical capacity and data to conduct their own analyses. Analysts informing policy decisions should address the following questions: 1) Is an economic analysis needed? 2) Should analyses address costs, epidemiological outcomes, or both? 3) If costs are considered, what sort of analysis is needed? 4) If outcomes are considered, what sort of model should be used? 5) How complex should the analysis be? 6) How should uncertainty be captured? 7) How should model results be communicated? Selecting the appropriate analysis is essential to ensure that all the important features of the decision problem are correctly represented, but that the analyses are not more complex than necessary. This report describes the consensus of an expert group convened by the World Health Organization, prioritizing key issues to be addressed when considering economic analyses to support HPV vaccine introduction in these countries.
RepExplore: addressing technical replicate variance in proteomics and metabolomics data analysis.
Glaab, Enrico; Schneider, Reinhard
2015-07-01
High-throughput omics datasets often contain technical replicates included to account for technical sources of noise in the measurement process. Although summarizing these replicate measurements by using robust averages may help to reduce the influence of noise on downstream data analysis, the information on the variance across the replicate measurements is lost in the averaging process and is therefore typically disregarded in subsequent statistical analyses. We introduce RepExplore, a web service dedicated to exploiting the information captured in the technical replicate variance to provide more reliable and informative differential expression and abundance statistics for omics datasets. The software builds on previously published statistical methods, which have been applied successfully to biomedical omics data but are difficult to use without prior experience in programming or scripting. RepExplore facilitates the analysis by providing fully automated data processing and interactive ranking tables, whisker plots, heat maps and principal component analysis visualizations to interpret omics data and derived statistics. Availability and implementation: freely available at http://www.repexplore.tk. Contact: enrico.glaab@uni.lu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
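The general idea, illustrated here with a toy example rather than RepExplore's actual algorithm, is to let the technical-replicate variance down-weight features whose replicates disagree, instead of discarding that information after averaging:

    import numpy as np

    # Rows: features; columns: technical replicates, for two conditions
    cond_a = np.array([[10.1, 10.3, 9.9], [5.0, 8.0, 2.0]])
    cond_b = np.array([[12.0, 11.8, 12.2], [5.5, 7.5, 3.5]])

    mean_a, mean_b = cond_a.mean(axis=1), cond_b.mean(axis=1)
    # Pooled technical variance per feature (small floor for stability)
    tech_var = (cond_a.var(axis=1, ddof=1) + cond_b.var(axis=1, ddof=1)) / 2 + 1e-6

    score = (mean_b - mean_a) / np.sqrt(tech_var)
    print(score)   # feature 1 (tight replicates) outranks feature 2 (noisy)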
Shambira, Gerald; Gombe, Notion Tafara; Hall, Casey Daniel; Park, Meeyoung Mattie; Frimpong, Joseph Asamoah
2017-01-01
The government of Zimbabwe began providing antiretroviral therapy (ART) to People Living with HIV/AIDS (PLHIV) in public institutions in 2004. In Midlands province two clinics constituted the most active HIV care service points, with patients being followed up through a comprehensive patient monitoring and tracking system which captured specific patient variables and outcomes over time. The data from 2006 to 2011 were subjected to analysis to answer specific research questions and this case study is based on that analysis. The goal of this case study is to build participants' capacity to undertake secondary data analysis and interpretation using a dataset for HIV antiretroviral therapy in Zimbabwe and to draw conclusions which inform recommendations. Case studies in applied epidemiology allow students to practice applying epidemiologic skills in the classroom to address real-world public health problems. Case studies as a vital component of an applied epidemiology curriculum are instrumental in reinforcing principles and skills covered in lectures or in background reading. The target audience includes Field Epidemiology and Laboratory Training Programs (FELTPs), university students, district health executives, and health information officers.
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Gendy, Atef; Saleeb, Atef F.; Mark, John; Wilt, Thomas E.
2007-01-01
Two reports discuss, respectively, (1) the generalized viscoplasticity with potential structure (GVIPS) class of mathematical models and (2) the Constitutive Material Parameter Estimator (COMPARE) computer program. GVIPS models are constructed within a thermodynamics- and potential-based theoretical framework, wherein one uses internal state variables and derives constitutive equations for both the reversible (elastic) and the irreversible (viscoplastic) behaviors of materials. Because of the underlying potential structure, GVIPS models not only capture a variety of material behaviors but also are very computationally efficient. COMPARE comprises (1) an analysis core and (2) a C++-language subprogram that implements a Windows-based graphical user interface (GUI) for controlling the core. The GUI relieves the user of the sometimes tedious task of preparing data for the analysis core, freeing the user to concentrate on the task of fitting experimental data and ultimately obtaining a set of material parameters. The analysis core consists of three modules: one for GVIPS material models, an analysis module containing a specialized finite-element solution algorithm, and an optimization module. COMPARE solves the problem of finding GVIPS material parameters in the manner of a design-optimization problem in which the parameters are the design variables.
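The parameter-estimation task COMPARE automates is, at its core, a nonlinear least-squares fit of a constitutive model to experimental curves. A generic sketch with a toy power-law creep model (hypothetical; GVIPS models are far richer):

    import numpy as np
    from scipy.optimize import least_squares

    def creep_strain(params, t):
        a, n = params
        return a * t**n                     # toy creep law, not a GVIPS model

    t = np.linspace(0.1, 100.0, 50)
    rng = np.random.default_rng(1)
    data = creep_strain([0.02, 0.4], t) * (1 + 0.05 * rng.normal(size=t.size))

    fit = least_squares(lambda p: creep_strain(p, t) - data, x0=[0.01, 0.5])
    print(fit.x)   # recovered "material parameters" [a, n]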
Dissociative recombination of O2(+), NO(+) and N2(+)
NASA Technical Reports Server (NTRS)
Guberman, S. L.
1983-01-01
A new L² approach for the calculation of the threshold molecular capture width needed for the determination of DR cross sections was developed. The widths are calculated with Fermi's golden rule by substituting Rydberg orbitals for the free-electron continuum Coulomb orbital. It is shown that the calculated width converges exponentially as the effective principal quantum number of the Rydberg orbital increases. The threshold capture width is then easily obtained. Since atmospheric recombination involves very low energy electrons, the threshold capture widths are essential to the calculation of DR cross sections for the atmospheric species studied here. The approach described makes use of bound-state computer codes already in use. A program that collects width matrix elements over CI wavefunctions for the initial and final states is described.
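For orientation, the width in this scheme follows from Fermi's golden rule with the energy-normalized continuum orbital replaced by a bound Rydberg orbital of effective principal quantum number n*; a standard quantum-defect normalization (assumed here, not quoted from the abstract) relates the two:

\[
\Gamma = 2\pi\,\bigl|\langle \Psi_d \,|\, H_{\mathrm{el}} \,|\, \Psi_E \rangle\bigr|^2
\;\approx\; 2\pi\,(n^*)^3\,\bigl|\langle \Psi_d \,|\, H_{\mathrm{el}} \,|\, \Psi_{n^*} \rangle\bigr|^2
\quad\text{(atomic units)},
\]

since the density of Rydberg states scales as \(dn/dE = (n^*)^3\); monitoring Γ as n* increases then yields the converged threshold capture width.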
50 CFR 216.93 - Tracking and verification program.
Code of Federal Regulations, 2012 CFR
2012-10-01
... in the 50 states, Puerto Rico, or American Samoa receives a domestic or imported shipment of ETP..., dressed, gilled and gutted, other), weight in short tons to the fourth decimal, ocean area of capture (ETP...
50 CFR 216.93 - Tracking and verification program.
Code of Federal Regulations, 2010 CFR
2010-10-01
... in the 50 states, Puerto Rico, or American Samoa receives a domestic or imported shipment of ETP..., dressed, gilled and gutted, other), weight in short tons to the fourth decimal, ocean area of capture (ETP...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-24
... Gateway National Recreation Area. Lethal methods could include shooting or euthanasia in conjunction with..., such as shooting, euthanasia, live capture, relocation, etc.? Will the management strategies mentioned...
ERIC Educational Resources Information Center
Chevalier, Cheryl; Pippen, Mary H.; Stevens, Dorothy
2008-01-01
The authors describe a hands-on program that they developed after their attendance at the NCTM Algebra Academy. The article explains how to use literature to capture youngsters' attention and engage them in interactive mathematical activities.
40 CFR 51.362 - Motorist compliance enforcement program oversight.
Code of Federal Regulations, 2010 CFR
2010-07-01
... collection through the use of automatic data capture systems such as bar-code scanners or optical character... determination of compliance through parking lot surveys, road-side pull-overs, or other in-use vehicle...
40 CFR 51.362 - Motorist compliance enforcement program oversight.
Code of Federal Regulations, 2011 CFR
2011-07-01
... collection through the use of automatic data capture systems such as bar-code scanners or optical character... determination of compliance through parking lot surveys, road-side pull-overs, or other in-use vehicle...
Hayward Youth-Based Trash Capture, Reduction, and Watershed Education Project
Information about the SFBWQP City of Hayward, CA Trash Management Project, part of an EPA competitive grant program to improve SF Bay water quality, focused on restoring impaired waters and enhancing aquatic resources.
Innovations in dynamic test restraint systems
NASA Technical Reports Server (NTRS)
Fuld, Christopher J.
1990-01-01
Recent launch system development programs have led to a new generation of large-scale dynamic tests. The variety of test scenarios share one common requirement: restrain and capture massive, high-velocity flight hardware with no structural damage. The Space Systems Lab of McDonnell Douglas developed a remarkably simple and cost-effective approach to such testing, using ripstitch energy absorbers adapted from the sport of technical rock climbing. The proven reliability of the capture system concept has led to a wide variety of applications in test system design and in aerospace hardware design.