Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Svetlana Shasharina
The goal of the Center for Technology for Advanced Scientific Component Software (TASCS) is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into those applications, testing the tools in the applications, and modifying them to be more usable.
Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostadin, Damevski
A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malony, Allen D; Shende, Sameer
The primary goal of the University of Oregon's DOE "competitiveness" project was to create performance technology that embodies and supports knowledge of performance data, analysis, and diagnosis in parallel performance problem solving. The target of our development activities was the TAU Performance System, and the technology accomplishments reported in this and prior reports have all been incorporated in the TAU open software distribution. In addition, the project has been committed to maintaining strong interactions with the DOE SciDAC Performance Engineering Research Institute (PERI) and the Center for Technology for Advanced Scientific Component Software (TASCS). This collaboration has proved valuable for translation of our knowledge-based performance techniques to parallel application development and performance engineering practice. Our outreach has also extended to the DOE Advanced CompuTational Software (ACTS) collection and project. Throughout the project we have participated in the PERI and TASCS meetings, as well as the ACTS annual workshops.
EGRET High Energy Capability and Multiwavelength Flare Studies and Solar Flare Proton Spectra
NASA Technical Reports Server (NTRS)
Chupp, Edward L.
1997-01-01
UNH was assigned the responsibility to use their accelerator neutron measurements to verify the TASC response function and to modify the TASC fitting program to include a high energy neutron contribution. Direct accelerator-based measurements by UNH of the energy-dependent efficiencies for detecting neutrons with energies from 36 to 720 MeV in NaI were compared with Monte Carlo TASC calculations. The calculated TASC efficiencies are somewhat lower (by about 20%) than the accelerator results in the energy range 70-300 MeV. The measured energy-loss spectrum for 207 MeV neutron interactions in NaI was compared with the Monte Carlo response for 200 MeV neutrons in the TASC, indicating good agreement. Based on this agreement, the simulation was considered to be sufficiently accurate to generate a neutron response library to be used by UNH in modifying the TASC fitting program to include a neutron component in the flare spectrum modeling. TASC energy-loss data on the 1991 June 11 flare were transferred to UNH. An appendix is also included: Gamma-rays and neutrons as a probe of flare proton spectra: the solar flare of 11 June 1991.
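As a rough illustration of the response-library fitting described in this abstract, the sketch below decomposes a measured energy-loss spectrum into non-negative amplitudes of component templates (a gamma continuum, a nuclear line, and a broad neutron component). Every template shape, energy, and amplitude here is a hypothetical placeholder; this is not the TASC fitting program or EGRET data.

```python
# Sketch of fitting a measured energy-loss spectrum as a non-negative
# combination of response-library templates, including a neutron component.
# All templates and the "measured" spectrum are hypothetical placeholders.
import numpy as np
from scipy.optimize import nnls

n_bins = 200
energy = np.linspace(1.0, 200.0, n_bins)                 # MeV energy-loss bins

# Hypothetical response library: each column is the detector response
# (counts per bin) to one emission component at unit amplitude.
gamma_continuum = np.exp(-energy / 30.0)                            # bremsstrahlung-like
nuclear_line = np.exp(-0.5 * ((energy - 4.4) / 0.8) ** 2)           # e.g. a 4.4 MeV line
neutron_response = np.exp(-0.5 * ((energy - 60.0) / 25.0) ** 2)     # broad, high energy
library = np.column_stack([gamma_continuum, nuclear_line, neutron_response])

# Hypothetical measured spectrum: gammas plus a neutron contribution plus noise.
rng = np.random.default_rng(0)
truth = library @ np.array([100.0, 40.0, 25.0])
measured = rng.poisson(truth).astype(float)

# Weighted non-negative least squares gives the amplitude of each component.
weights = 1.0 / np.sqrt(np.maximum(measured, 1.0))       # ~1/sigma for Poisson counts
amps, _ = nnls(library * weights[:, None], measured * weights)
print("fitted amplitudes (gamma continuum, nuclear line, neutrons):", amps)
```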
TASC II and the endovascular management of infrainguinal disease.
Lyden, Sean P; Smouse, H Bob
2009-04-01
The stratifications of aortoiliac, femoropopliteal, and infrapopliteal lesions included in the original comprehensive report of the TransAtlantic Inter-Society Consensus (TASC I) have been commonly used to formally characterize clinical trial populations and to frame investigative discussion among clinicians, while the associated treatment recommendations have become outdated compared with current clinical practice. The TASC II report is an abbreviated update focusing on key areas of diagnosis and management of peripheral artery disease, with revised stratifications of aortoiliac and femoropopliteal lesions but not infrapopliteal disease. The consensus document keeps the new lesion stratifications linked to the same structure of recommendations for initial treatment: endovascular for type A, endovascular (with qualifications) for type B, open surgical (with qualifications) for type C, and open surgical for type D. In general, each TASC II lesion category includes more severe disease than in TASC I, but the TASC II report does not recommend specific endovascular modalities for infrainguinal occlusive disease. We discuss how the new TASC II femoropopliteal lesion categories reflect current research outcomes and clinical practice, including summarized results from recent studies that have demonstrated the ability to treat, by endovascular means, increasingly complex femoropopliteal lesions that would be classifiable as type C. Noting that TASC II does not include a separate stratification of infrapopliteal lesions, as TASC I did, we review evidence of recent endovascular treatment of infrapopliteal lesions and contend that TASC classifications in this anatomical area should be updated.
Shared Features of High-Performing After-School Programs: A Follow-Up to the TASC Evaluation
ERIC Educational Resources Information Center
Birmingham, Jennifer; Pechman, Ellen M.; Russell, Christina A.; Mielke, Monica
2005-01-01
This study examined high-performing after-school projects funded by The After-School Corporation (TASC), to determine what characteristics, if any, these projects shared. Evaluators reanalyzed student performance data collected during the multi-year evaluation of the TASC initiative to identify projects where the after-school program was…
The TASC Wheel Supports a Honey Bee Challenge
ERIC Educational Resources Information Center
Seeley, Claire
2011-01-01
The concept of TASC (Thinking Actively in a Social Context) was created by Belle Wallace (Wallace et al., 1993) as a model that can be used to nurture and develop thinking skills. As children work through the TASC wheel, the teacher has a very good opportunity to facilitate explicit conversations about thinking. This allows the children to grow in…
1995-05-01
...TASC's other Synthetic Environment programs, including Weather in DIS (WINDS) and Multi-Echelon CFOR with ForeSight (MECFS). Prior to joining TASC, Mr. Stanzione served as the deputy director of the Semi-Automated Forces group at Loral Advanced Distributed...
ERIC Educational Resources Information Center
Council of Chief State School Officers, 2013
2013-01-01
The Council of Chief State School Officers (CCSSO), through its Interstate Teacher Assessment and Support Consortium (InTASC), offers this set of combined resources that define and support ongoing teacher effectiveness to ensure students reach college and career ready standards. This document includes the "InTASC Model Core Teaching…
Outcomes of endovascular interventions for TASC II B and C femoropopliteal lesions.
Baril, Donald T; Marone, Luke K; Kim, Justine; Go, Michael R; Chaer, Rabih A; Rhee, Robert Y
2008-09-01
To evaluate outcomes of endovascular interventions on femoropopliteal occlusive disease and determine predictors of restenosis of TransAtlantic Inter-Society Consensus (TASC) II B and C lesions. All patients undergoing endovascular interventions for femoropopliteal occlusive disease between May 2003 and July 2007 were reviewed. Patient demographics, pre- and post-procedure ankle-brachial indices (ABI), and anatomic factors (including categorization by TASC II classification, lesion length, and runoff vessel status) were analyzed. Outcomes evaluated included freedom from restenosis, freedom from re-intervention, overall patency, and assisted patency. A total of 237 limbs were treated during the period reviewed. The study group included 108 TASC B and 32 TASC C limbs in 125 patients (mean age 73.1 +/- 10.4 years, male sex: 59%). Seventy-one percent of patients were Rutherford classification 2/3 while the remaining 29% were Rutherford classification 4/5. Mean follow-up period was 12.7 months (range, 1-52 months). Forty-one limbs experienced restenosis or occlusion at a mean time of 8 months (range, 1-24 months). Freedom from restenosis/occlusion was 58.9% at 12 months and 47.9% at 24 months. Predictors of restenosis included a preoperative ABI <0.5 (hazard ratio [HR] 3.05, 95% confidence interval [CI] 1.36-6.86, P = .007) and hypercholesterolemia (HR 2.42, 95% CI 1.11-5.25, P = .025). Lesion length as a continuous variable (per centimeter) also correlated with a higher risk of restenosis (HR 1.06, 95% CI 1.00-1.12, P = .057). The overall assisted-primary and secondary patency rates were 87% and 94%, respectively, at 3 years, with no significant differences between TASC B and TASC C limbs. Endovascular interventions for TASC II B and C lesions are associated with restenosis/occlusion rates that are at least as good as those of open femoropopliteal bypass surgery in historical, previously published series. Furthermore, overall assisted-patency rates are excellent, although low preoperative ABIs continue to be associated with worse outcomes.
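A minimal sketch of the kind of restenosis-predictor analysis reported above, a Cox proportional hazards model of time to restenosis/occlusion (here via the lifelines library). The data frame, its column names, and all values are invented for illustration; they are not the study's data.

```python
# Hedged sketch: Cox proportional hazards model of time to restenosis,
# with hypothetical covariates mirroring those named in the abstract.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months_to_event":      [8, 24, 12, 6, 18, 30, 10, 15, 9, 22],
    "restenosis":           [1, 0, 1, 1, 0, 0, 1, 0, 1, 0],   # 1 = restenosis/occlusion
    "preop_abi_lt_0_5":     [1, 0, 1, 1, 0, 0, 1, 0, 0, 0],   # preoperative ABI < 0.5
    "hypercholesterolemia": [1, 0, 0, 1, 1, 0, 1, 0, 1, 0],
    "lesion_length_cm":     [12.0, 4.0, 9.5, 15.0, 5.0, 3.0, 11.0, 6.0, 10.0, 4.5],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_event", event_col="restenosis")
cph.print_summary()   # hazard ratios with 95% confidence intervals
```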
Jia, Cheng; Hu, Yu; Kelly, Derek; Kim, Junhyong; Li, Mingyao; Zhang, Nancy R
2017-11-02
Recent technological breakthroughs have made it possible to measure RNA expression at the single-cell level, thus paving the way for exploring expression heterogeneity among individual cells. Current single-cell RNA sequencing (scRNA-seq) protocols are complex and introduce technical biases that vary across cells, which can bias downstream analysis without proper adjustment. To account for cell-to-cell technical differences, we propose a statistical framework, TASC (Toolkit for Analysis of Single Cell RNA-seq), an empirical Bayes approach to reliably model the cell-specific dropout rates and amplification bias by use of external RNA spike-ins. TASC incorporates the technical parameters, which reflect cell-to-cell batch effects, into a hierarchical mixture model to estimate the biological variance of a gene and detect differentially expressed genes. More importantly, TASC is able to adjust for covariates to further eliminate confounding that may originate from cell size and cell cycle differences. In simulation and real scRNA-seq data, TASC achieves accurate Type I error control and displays competitive sensitivity and improved robustness to batch effects in differential expression analysis, compared to existing methods. TASC is programmed to be computationally efficient, taking advantage of multi-threaded parallelization. We believe that TASC will provide a robust platform for researchers to leverage the power of scRNA-seq. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
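The following is a simplified, hypothetical illustration of the spike-in idea behind TASC, not the published implementation: for each cell, observed spike-in counts are regressed against the known input molecule numbers to estimate cell-specific capture/amplification parameters, and a dropout rate is read off the spike-ins observed at zero.

```python
# Simplified illustration (not the TASC implementation): per-cell regression of
# observed spike-in counts on known inputs to estimate capture efficiency
# (alpha_c) and amplification bias (beta_c), plus an empirical dropout rate.
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_spikeins = 50, 30
true_molecules = rng.uniform(1, 1000, n_spikeins)         # known spike-in inputs

# Hypothetical observed counts with cell-specific parameters and dropout
# that is more likely for low-abundance spike-ins.
alpha = rng.uniform(0.05, 0.3, n_cells)
beta = rng.uniform(0.8, 1.1, n_cells)
mean_counts = alpha[:, None] * true_molecules[None, :] ** beta[:, None]
counts = rng.poisson(mean_counts)
dropout = rng.random((n_cells, n_spikeins)) < np.exp(-0.01 * true_molecules)
counts = np.where(dropout, 0, counts)

for c in range(3):                                         # first few cells
    obs = counts[c] > 0
    # log-linear fit: log(count) = log(alpha_c) + beta_c * log(input)
    b, log_a = np.polyfit(np.log(true_molecules[obs]), np.log(counts[c, obs]), 1)
    print(f"cell {c}: alpha~{np.exp(log_a):.3f}, beta~{b:.2f}, "
          f"dropout rate~{1 - obs.mean():.2f}")
```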
EGRET High Energy Capability and Multiwavelength Flare Studies and Solar Flare Proton Spectra
NASA Technical Reports Server (NTRS)
Chupp, Edward L.
1998-01-01
The accomplishments from participation in the Compton Gamma Ray Observatory Guest Investigator program are summarized in this report. The work involved the study of Energetic Gamma Ray Experiment Telescope (EGRET)/Total Absorption Shower Counter (TASC) flare data. The specific accomplishments were the use of the accelerator neutron measurements obtained at the University of New Hampshire to verify the TASC response function, to modify the TASC fitting program to include a high energy neutron contribution, and to determine a high energy neutron contribution to the emissions from the 1991 June 11 solar flare. The next step in the analysis of this event was fitting the TASC energy-loss spectra as a function of time. A significant hardening of the solar proton spectrum over time was found for the flare. Further data were obtained from the Yohkoh HXT time histories and images for the 1991 October 27 flare. The results to date demonstrate that the TASC spectral analysis contributes crucial information on the particle spectrum interacting at the Sun. The report includes a paper accepted for publication, a draft of a paper to be delivered at the 26th International Cosmic Ray Conference, and an abstract of a paper to be presented at the Meeting of the American Physical Society.
Freckmann, Anneka; Hines, Monique; Lincoln, Michelle
2017-06-01
To investigate the face validity of a measure of therapeutic alliance for paediatric speech-language pathology and to determine whether a difference exists in therapeutic alliance reported by speech-language pathologists (SLPs) conducting face-to-face sessions compared with telepractice SLPs, or in their ratings of confidence with technology. SLPs conducting telepractice (n = 14) or face-to-face therapy (n = 18) completed an online survey that included the Therapeutic Alliance Scales for Children - Revised (TASC-r) (Therapist Form) to rate clinicians' perceptions of rapport with up to three clients. Participants also reported their overall perception of rapport with each client and their comfort with technology. There was a strong correlation between TASC-r total scores and overall ratings of rapport, providing preliminary evidence of TASC-r face validity. There was no significant difference between TASC-r scores for telepractice and face-to-face therapy (p = 0.961), nor between face-to-face and telepractice SLPs' confidence with familiar (p = 0.414) or unfamiliar technology (p = 0.780). The TASC-r may be a promising tool for measuring therapeutic alliance in speech-language pathology. Telepractice does not appear to have a negative effect on rapport between SLPs and paediatric clients. Future research is required to identify how SLPs develop rapport in telepractice.
The association of statin therapy with the primary patency of femoral and popliteal artery stents.
de Grijs, Derek; Teixeira, Pedro; Katz, Steven
2018-05-01
It has long been known that hydroxymethylglutaryl-coenzyme A reductase inhibitors (statins) broadly reduce cardiovascular events in patients with peripheral vascular disease. The goal of this study was to determine whether there is an association between statin therapy and primary patency after stenting of the superficial femoral and popliteal arteries. The records of all patients undergoing primary nitinol stenting of the femoral and popliteal arteries at a single institution by a single surgeon during a 10-year period were reviewed. Demographic characteristics of the patients and risk factors were identified. TransAtlantic Inter-Society Consensus (TASC II) classifications were determined for all stented lesions. Analysis was performed to determine whether the use of statins at the time of stent placement was associated with a change in rates of primary patency. Loss of primary patency was defined as an intrastent occlusion or a ≥70% stenosis identified by arterial duplex ultrasound or angiography. Kaplan-Meier survival curves were plotted, and differences between groups were tested by the log-rank method. Between 2004 and 2014, primary femoral or popliteal stenting was performed on 308 limbs in 250 patients. At the time of intervention, 52.4% of these patients were being treated with statin therapy; 137 interventions were done for claudication and 113 for critical limb ischemia. Of the lesions treated, 165 were TASC A or B and 85 were TASC C or D. Primary patency rates for all stented lesions were 75%, 54%, and 35% at 12, 24, and 36 months. The patency rates at 12, 24, and 36 months, respectively, were 80%, 55%, and 40% for those taking statins and 68%, 49%, and 28% for those not taking statins (P = .178). Statin therapy demonstrated a trend toward an association with improved primary patency rates in TASC A/B lesions but had no association in TASC C/D lesions (TASC A/B, P = .056; TASC C/D, P = .537). Statin compliance was found to be 87% at a mean follow-up of 24.1 months. Although the use of statins has been shown to reduce cardiovascular morbidity and mortality in patients with peripheral vascular disease, overall these drugs were not associated with improved primary patency after primary stenting of femoral and popliteal artery lesions. However, when limbs were stratified by severity, less severe (TASC A/B) lesions demonstrated a trend toward a significant association between statin use and improved primary patency. This finding was not seen in more severe (TASC C/D) disease. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
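A minimal sketch of the patency-comparison methodology described above: Kaplan-Meier estimates per group and a log-rank test, here using the lifelines library. All values and column names are hypothetical, not the study's data.

```python
# Hedged sketch: Kaplan-Meier patency curves for statin vs. no-statin limbs,
# compared with a log-rank test. Data are hypothetical placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "months":       [12, 30, 8, 24, 36, 6, 18, 36, 10, 28],
    "loss_patency": [1, 0, 1, 1, 0, 1, 1, 0, 1, 0],  # 1 = occlusion or >=70% stenosis
    "statin":       [1, 1, 0, 1, 1, 0, 0, 1, 0, 0],
})

kmf = KaplanMeierFitter()
for group, label in [(1, "statin"), (0, "no statin")]:
    mask = df["statin"] == group
    kmf.fit(df.loc[mask, "months"], df.loc[mask, "loss_patency"], label=label)
    print(label, "12-month patency ~", float(kmf.predict(12)))

result = logrank_test(
    df.loc[df.statin == 1, "months"], df.loc[df.statin == 0, "months"],
    df.loc[df.statin == 1, "loss_patency"], df.loc[df.statin == 0, "loss_patency"],
)
print("log-rank p =", result.p_value)
```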
Time-aware service-classified spectrum defragmentation algorithm for flex-grid optical networks
NASA Astrophysics Data System (ADS)
Qiu, Yang; Xu, Jing
2018-01-01
By employing sophisticated routing and spectrum assignment (RSA) algorithms together with a finer spectrum granularity (namely, the frequency slot) in resource allocation, flex-grid optical networks can accommodate diverse kinds of services with high spectrum-allocation flexibility and resource-utilization efficiency. However, the continuity and contiguity constraints in spectrum allocation may induce isolated, small-sized, unoccupied spectral blocks (known as spectrum fragments) in flex-grid optical networks. Although these spectrum fragments are left unoccupied, they can hardly be utilized directly by subsequent service requests because of their spectral characteristics and the constraints in spectrum allocation. In this way, the existence of spectrum fragments may exhaust the available spectrum resources for a coming service request and thus worsens networking performance. Therefore, many reactive defragmentation algorithms have been proposed to handle fragmented spectrum resources by re-optimizing the routing paths and spectrum resources of existing services. But the routing-path and spectrum-resource re-optimization in reactive defragmentation algorithms may disrupt the traffic of existing services and require extra components. By comparison, some proactive defragmentation algorithms (e.g., fragmentation-aware algorithms) were proposed to suppress spectrum fragments at their generation instead of handling already fragmented spectrum resources. Although these proactive defragmentation algorithms induce no traffic disruption and require no extra components, they leave the generated spectrum fragments unhandled, which greatly limits their efficiency in spectrum defragmentation. In this paper, by comprehensively considering the characteristics of both the reactive and the proactive defragmentation algorithms, we propose a time-aware service-classified (TASC) spectrum defragmentation algorithm, which simultaneously employs proactive and reactive mechanisms to suppress spectrum fragments with awareness of services' types and duration times. By dividing the spectrum resources into several flexible groups according to service type, and limiting both the spectrum allocation and the spectrum re-tuning for a given service to the specific spectrum group matching its type, the proposed TASC defragmentation algorithm can not only suppress spectrum fragments from generation inside each spectrum group, but also handle the fragments generated between two adjacent groups. In this way, the proposed TASC algorithm achieves higher efficiency in suppressing spectrum fragments than both the reactive and the proactive defragmentation algorithms. Additionally, as the generation of spectrum fragments is restrained between spectrum groups and the defragmentation procedure is limited to each spectrum group, the traffic disruption induced for existing services can be reduced. Moreover, the proposed TASC defragmentation algorithm always re-tunes the spectrum resources of the service with the maximum duration time first in the defragmentation procedure, which further reduces spectrum fragments, because services with longer duration times have a higher likelihood of inducing spectrum fragments than services with shorter duration times.
The simulation results show that the proposed TASC defragmentation algorithm can significantly reduce the number of generated spectrum fragments while improving service blocking performance.
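The following minimal sketch illustrates the service-classified, time-aware idea described in this abstract under simplifying assumptions (a single link, fixed per-class slot groups, first-fit contiguous allocation): allocation is confined to a service's class group, and when it fails, that group alone is defragmented by re-tuning its services in descending order of remaining duration. The data structures and parameters are illustrative, not the paper's simulator.

```python
# Hedged sketch of service-classified, time-aware spectrum defragmentation
# on a single link: per-class slot groups, first-fit allocation within a
# group, and in-group re-tuning ordered by remaining duration (longest first).
from dataclasses import dataclass, field

N_SLOTS = 64
GROUPS = {"classA": range(0, 32), "classB": range(32, 64)}  # per-class partitions

@dataclass
class Service:
    name: str
    cls: str
    width: int          # required contiguous frequency slots
    duration: float     # remaining holding time
    start: int = -1     # first allocated slot index

@dataclass
class Spectrum:
    occupied: list = field(default_factory=lambda: [False] * N_SLOTS)
    active: list = field(default_factory=list)

    def _first_fit(self, group, width):
        run = 0
        for s in group:
            run = run + 1 if not self.occupied[s] else 0
            if run == width:
                return s - width + 1
        return None

    def release(self, svc):
        for s in range(svc.start, svc.start + svc.width):
            self.occupied[s] = False
        self.active.remove(svc)

    def defragment(self, cls):
        # Re-tune this class's services, longest remaining duration first,
        # packing them toward the low end of their own group only.
        group = GROUPS[cls]
        svcs = sorted((v for v in self.active if v.cls == cls),
                      key=lambda v: -v.duration)
        for s in group:
            self.occupied[s] = False
        nxt = group.start
        for v in svcs:
            v.start = nxt
            for s in range(nxt, nxt + v.width):
                self.occupied[s] = True
            nxt += v.width

    def allocate(self, svc):
        start = self._first_fit(GROUPS[svc.cls], svc.width)
        if start is None:                 # reactive step, confined to one group
            self.defragment(svc.cls)
            start = self._first_fit(GROUPS[svc.cls], svc.width)
        if start is None:
            return False                  # request blocked
        svc.start = start
        for s in range(start, start + svc.width):
            self.occupied[s] = True
        self.active.append(svc)
        return True

spec = Spectrum()
for i, (w, d) in enumerate([(6, 9.0), (4, 2.0), (8, 5.0)]):
    spec.allocate(Service(f"req{i}", "classA", w, d))
spec.release(spec.active[1])                        # departure leaves a gap
spec.allocate(Service("req3", "classA", 15, 7.0))   # triggers in-group defrag
print([(v.name, v.start) for v in spec.active])
```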
Improving Adolescent Learning: An Action Agenda. A TASC Report
ERIC Educational Resources Information Center
Duffrin, Elizabeth
2014-01-01
At a recent national forum at the Ford Foundation in New York, 140 education and youth development professionals discussed how to better support adolescent learning. Drawing on the discussion and the latest research in neuroscience, psychology and cognitive learning science, TASC presents an action agenda that can be tailored to circumstances in…
Rethinking edTPA: The Use of InTASC Principles and Standards
ERIC Educational Resources Information Center
Kuo, Nai-Cheng
2018-01-01
The Interstate Teacher Assessment and Support Consortium (InTASC) Model Core Teaching Standards and Learning Progressions for Teachers 1.0, developed by the Council of Chief State School Officers (CCSSO, 2013) in the United States, provide a set of expectations for essential knowledge, critical disposition, and performance needed for high-quality…
Tips for Finding the Right Partner. A TASC Resource Guide
ERIC Educational Resources Information Center
ExpandED Schools, 2014
2014-01-01
ExpandED Schools (formerly TASC) has spent the past 17 years helping schools and community organizations find the right partners. This resource guide offers a guide to the strategic questions school teams might ask themselves and their potential partners to have the best outcomes for young people. The following attachment is included: School…
ERIC Educational Resources Information Center
Sinclair, Beth; Russell, Christina A.; McCann, Colleen; Hildreth, Jeanine L.
2014-01-01
Policy Studies Associates (PSA) is conducting a five-year evaluation of the implementation and impact of the national demonstration of a model for expanded learning time developed by "The After-School Corporation" (TASC). This model, called "ExpandED Schools," aims to transform the educational experiences of students in ways…
Brouillet, Julie; Deloose, Koen; Goueffic, Yann; Poirier, Mathieu; Midy, Dominique; Caradu, Caroline; Ducasse, Eric
2018-06-01
Recent advances in endovascular techniques have made it a seductive choice in the management of TASC C and D lesions. Currently, this tendency remains controversial, despite high success rates. The aim of the study was to regroup and harmonize the results of three surgical teams in 5 centers in order to obtain the largest series ever published on primary stenting of TASC C and D femoro-popliteal lesions. Two hundred and three patients and 209 lower limbs were included from March 2008 to October 2013. Each patient underwent primary stenting for TASC C or D femoro-popliteal lesions. Mean age was 70±10 years; 71.4% were male, with a 39.8% rate of coronary heart disease, 20.1% of renal insufficiency and 35.9% of diabetes; 57.4% suffered from claudication and 42.6% from critical limb ischemia (CLI); 61.8% of the 144 limbs analyzed for run-off presented with 3 patent infra-popliteal arteries. Four hundred and three stents were implanted in the 209 limbs included. Median stented length was 252 mm. Associated procedures were performed in 35 patients (17.0%), including 4.3% minor amputations. The 30-day mortality rate was 1.4% (3 patients). Major complications occurred in 19 patients (9.3%), including 7 patients (3.4%) who presented with early in-stent thrombosis. Median follow-up duration was 12 months (range 9.5-17.2 months). The 12-month mortality rate was 11.8% (24 patients). The 3-, 6- and 12-month primary patency rates according to Kaplan-Meier estimates were 98.1±0.9%, 85.2±2.5% and 67.0±3.3%, respectively. Secondary patency rates were 96.1±1.9%, 89.3±3.0% and 75.7±4.2%, respectively. A subgroup analysis reported significantly higher patency rates for TASC C lesions compared to TASC D lesions (82.1% vs. 44%, P=0.009). The 12-month in-stent thrombosis and restenosis rates were 19.6% and 13.9%, respectively. A subgroup analysis showed higher rates of in-stent restenosis for TASC D lesions compared to TASC C lesions (35% vs. 10%, P=0.005). The stent fracture rate was 10.2% (30 stents). In-stent thrombosis and restenosis were associated with 3 and 5 cases of stent fracture (types II to IV), respectively. Freedom from TLR was 70.5%. Rutherford class decreased from 3.7 to 0.9 (3.52±1.06 to 0.75±1.24) (P<0.0001). At 12 months, 61.3% were asymptomatic, 33.3% suffered from claudication (21.3% Rutherford 1) and 5.4% suffered from CLI. Healing rates were 63.9%, with a limb salvage rate of 95.5% and a major amputation rate of 3.8%. This is the largest series of primary stenting of TASC C and D femoro-popliteal lesions to our knowledge. The results are encouraging, with acceptable primary patency and clinical improvement at 12 months. Results from mid- to long-term follow-up are awaited.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
... (TASC) program. The intended effect of this notice is to solicit applications from the private sector... funding authority for TASC expires at the end of fiscal year 2013. This notice is being published at this time to allow awards to be made early in fiscal year 2014, provided that program funding is...
Time to Grow: Year Two Report on ExpandED Schools. A TASC Report
ERIC Educational Resources Information Center
Traill, Saskia; Brohawn, Katie
2014-01-01
An analysis of data from the second year of The After-School Corporation's (TASC's) national demonstration of an expanded school day for elementary and middle school students shows that ExpandED Schools improved school culture, decreased rates of students' chronic absenteeism and helped students develop positive learning habits and attitudes.…
Avoiding the Attendance Slump: Strategies to Maximize Learning Time in June. A TASC Resource Guide
ERIC Educational Resources Information Center
Traill, Saskia; Brohawn, Katie
2014-01-01
The After-School Corporation (TASC) works to build education enrichment into an expanded school day because extra time spent in engaging learning activities leads to better outcomes in school and beyond. There's one month, however, when students in many schools lose learning time: June. In 2013, NYC elementary and middle schools saw their average…
ERIC Educational Resources Information Center
ExpandED Schools, 2014
2014-01-01
This guidebook was prepared by TASC (The After-School Corporation) and their Frontiers in Urban Science Education (FUSE) programs. FUSE is TASC's initiative to help more out-of-school-time programs and expanded learning time schools offer kids engaging, exciting and inspiring activities that promote science inquiry. The guidebook offers a…
ERIC Educational Resources Information Center
Traill, Saskia; Brohawn, Katie
2015-01-01
In the 2013-14 school year, TASC entered the third year of its national demonstration of ExpandED Schools. Ten elementary and middle schools in New York City, Baltimore and New Orleans continued their partnerships with youth-serving community organizations, such as settlement houses or community development corporations. Together, principals,…
Elmahdy, Mahmoud Farouk; Buonamici, Piergiovanni; Trapani, Maurizio; Valenti, Renato; Migliorini, Angela; Parodi, Guido; Antoniucci, David
2017-06-01
Endovascular therapy for long femoropopliteal lesions using percutaneous transluminal balloon angioplasty or first-generation peripheral stents has been associated with unacceptable one-year restenosis rates. However, with recent advances in equipment and techniques, a better primary patency rate is expected. This study was conducted to detect the long-term primary patency rate of nitinol self-expandable stents implanted in long, totally occluded femoropopliteal lesions (TransAtlantic Inter-Society Consensus (TASC II) type C and D), and to determine the predictors of reocclusion or restenosis in the stented segments. The demographics and clinical, anatomical, and procedural data of 213 patients with 240 de novo totally occluded femoropopliteal (TASC II type C and D) lesions treated with nitinol self-expandable stents were retrospectively analysed. Of these limbs, 159 (66.2%) presented with intermittent claudication, while 81 (33.8%) presented with critical limb ischaemia. The mean follow-up time was 36±22.6 months (range: 6.3-106.2 months). Outcomes evaluated were the primary patency rate and predictors of reocclusion or restenosis in the stented segments. The mean age of the patients was 70.9±9.3 years, and 66.2% were male. Mean pre-procedural ABI was 0.45±0.53. One hundred and seventy-five (73%) lesions were TASC II type C, while 65 (27%) were type D. The mean length of the lesions was 17.9±11.3 cm. Procedure-related complications occurred in 10 (4.1%) limbs. There was no periprocedural mortality. Reocclusion and restenosis were detected during follow-up in 45 and 30 limbs, respectively, and all were re-treated by an endovascular approach. None of the patients required major amputation. Primary patency rates were 81.4±1.1%, 77.7±1.9% and 74.4±2.8% at 12, 24, and 36 months, respectively. Male gender, severe calcification, and TASC II D lesions were independent predictors of reocclusion, while predictors of restenosis were diabetes mellitus, smoking, and TASC II D lesions. Treatment of long, totally occluded femoropopliteal (TASC II C and D) lesions with nitinol self-expandable stents is safe and is associated with highly acceptable long-term primary patency rates. Copyright © 2016 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
Building Future Security: Strategies for Restructuring the Defense Technology and Industrial Base.
1992-06-01
Beardsley Headquarters Air Force Logistics Command Wright- Patterson AFB, Ohio Don Carson TASC Arlington, VA William Clark Defense Systems Management...Vice Chairman Senate EDWARD M. KENNEDY Massachusetts ERNEST F. HOLLINGS South Carolina CLAIBORNE PELL Rhode Island ORRIN G. HATCH Utah...President TASC Julius Harwood Consultant William W. Kaufmann Senior Fellow The Brookings Institution General P.X. Kelley USMC (Ret.) James L
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torres-Blanco, Álvaro, E-mail: atorres658@yahoo.es; Edo-Fleta, Gemma; Gómez-Palonés, Francisco
2016-03-15
Purpose: To assess the safety and midterm effectiveness of endovascular treatment of Trans-Atlantic Inter-Society Consensus II (TASC-II) D femoropopliteal occlusions in patients with critical limb ischemia (CLI).
Methods: Patients with CLI who underwent endovascular treatment for TASC-D de novo femoropopliteal occlusive disease between September 2008 and December 2013 were selected. Data included anatomic features, pre- and postprocedure ankle-brachial index, duplex ultrasound, and periprocedural complications. Sustained clinical improvement, limb salvage rate, freedom from target lesion revascularization (TLR), and freedom from target extremity revascularization (TER) were assessed by Kaplan-Meier estimation, and predictors of restenosis/occlusion with Cox analysis.
Results: Thirty-two patients underwent treatment of 35 TASC-D occlusions. Mean age was 76 ± 9 years. Mean lesion length was 23 ± 5 cm. Twenty-eight limbs (80%) presented tissue loss. Seventeen limbs underwent treatment by stent, 13 by stent-graft, and 5 by angioplasty. Mean follow-up was 29 ± 20 months. Seven patients required major amputation and six patients died during follow-up. Eighteen endovascular and three surgical TLR procedures were performed due to restenosis or occlusion. Estimated freedom from TLR and TER rates at 2 years were 41% and 76%, whereas estimated primary and secondary patency rates were 41% and 79%, respectively.
Conclusions: Endovascular treatment for TASC II D lesions is safe and offers satisfying outcomes. This patient subset would benefit from a minimally invasive approach. Follow-up is advisable due to a high rate of restenosis. Further follow-up is necessary to know the long-term efficacy of these procedures.
Alhijjaj, Muqdad; Reading, Mike; Belton, Peter; Qi, Sheng
2015-11-03
Characterizing inter- and intrasample heterogeneity of solid and semisolid pharmaceutical products is important both for rational design of dosage forms and subsequent quality control during manufacture; however, most pharmaceutical products are multicomponent formulations that are challenging in this regard. Thermal analysis, in particular differential scanning calorimetry, is commonly used to obtain structural information, such as degree of crystallinity, or identify the presence of a particular polymorph, but the results are an average over the whole sample; it cannot directly provide information about the spatial distribution of phases. This study demonstrates the use of a new thermo-optical technique, thermal analysis by structural characterization (TASC), that can provide spatially resolved information on thermal transitions by applying a novel algorithm to images acquired by hot stage microscopy. We determined that TASC can be a low cost, relatively rapid method of characterizing heterogeneity and other aspects of structure. In the examples studied, it was found that high heating rates enabled screening times of 3-5 min per sample. In addition, this study demonstrated the higher sensitivity of TASC for detecting the metastable form of polyethylene glycol (PEG) compared to conventional differential scanning calorimetry (DSC). This preliminary work suggests that TASC will be a worthwhile additional tool for characterizing a broad range of materials.
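As a simplified illustration of how a thermal transition can be read out of a hot-stage image sequence (a generic sketch, not the published TASC algorithm), the code below tracks the mean absolute pixel change of each frame relative to a reference frame and reports the temperature at which the change crosses half its maximum. The frames are synthetic arrays standing in for real microscopy images.

```python
# Generic sketch (not the published TASC algorithm): locate a thermal
# transition from how much each frame differs from a reference frame.
import numpy as np

rng = np.random.default_rng(2)
temps = np.linspace(40.0, 80.0, 81)                    # degrees C
reference = rng.random((64, 64))                       # "solid" texture

def frame_at(temp, melt=60.0, width=1.5):
    """Synthetic frame: texture washes out smoothly around the melt point."""
    melted = 1.0 / (1.0 + np.exp(-(temp - melt) / width))
    return (1 - melted) * reference + melted * 0.5 + rng.normal(0, 0.01, (64, 64))

# Mean absolute pixel change relative to the reference frame, per temperature.
change = np.array([np.abs(frame_at(t) - reference).mean() for t in temps])
midpoint = temps[np.argmax(change > 0.5 * change.max())]
print(f"estimated transition midpoint ~ {midpoint:.1f} C")   # near 60 C here
```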
Hua, W R; Yi, M Q; Min, T L; Feng, S N; Xuan, L Z; Xing, J
2013-08-01
This study aimed to ascertain differences in benefit and effectiveness of popliteal versus tibial retrograde access in subintimal arterial flossing with antegrade-retrograde intervention (SAFARI). This was a retrospective study of SAFARI-assisted stenting for long chronic total occlusion (CTO) of TASC C and D superficial femoral lesions. Thirty-eight cases had superficial femoral artery lesions (23 TASC C and 15 TASC D), and all underwent SAFARI-assisted stenting. The ipsilateral popliteal artery was retrogradely punctured in 17 patients. A distal posterior tibial (PT) or dorsalis pedis (DP) artery was retrogradely punctured in 21 patients, 16 of them after open surgical exposure. SAFARI technical success was achieved in all cases. There was no significant difference in 1-year primary patency (75% vs. 78.9%, p = .86), secondary patency (81.2% vs. 84.2%, p = .91) or access complications (p = 1.00) between popliteal and tibial retrograde access. There was a statistically significant difference in operation time between popliteal access (140.1 ± 28.4 min) and tibial retrograde access with PT/DP puncture after surgical vessel exposure (120.4 ± 23.0 min, p = .04). The SAFARI technique is a safe and feasible option for patients with infrainguinal CTO (TASC II C and D). The PT or DP artery as the retrograde access after surgical vessel exposure is a good choice when using the SAFARI technique. Copyright © 2013 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
EGRET observations of bursts at MeV energies
NASA Astrophysics Data System (ADS)
Catelli, J. R.; Dingus, B. L.; Schneid, E. J.
1998-05-01
We present preliminary results from the analysis of 16 bright bursts that have been observed by the EGRET NaI calorimeter, or TASC. Seven bursts have been imaged in the EGRET spark chamber above 30 MeV, but in most cases the TASC data gives the highest energy spectra available for these bursts. The TASC can obtain spectral and rate information for bursts well outside the field of view of the EGRET spark chambers, and is sensitive in the energy range from 1 to 200 MeV. The spectra for these bursts are mostly consistent with a simple power law with spectral index in the range from 1.7 to 3.7, with several of the brighter bursts showing emission past 100 MeV. No high energy cutoff has been observed. These high energy photons offer important clues to the physical processes involved at the origin of burst emission. For bursts at cosmological distances extremely high bulk Lorentz factors are implied by the presence of MeV and GeV photons which have not been attenuated by pair production with the lower energy photons from the source.
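A small sketch of the power-law characterization quoted above: fitting dN/dE = A·E^(-alpha) by linear regression in log-log space. The synthetic spectrum and its parameters are illustrative, not TASC burst data.

```python
# Hedged sketch: recover a photon spectral index from a synthetic power-law
# spectrum, dN/dE = A * E**(-alpha), via a straight-line fit in log-log space.
import numpy as np

rng = np.random.default_rng(3)
energy = np.geomspace(1.0, 200.0, 40)             # MeV, roughly the TASC band
true_alpha = 2.5
flux = 1e3 * energy ** (-true_alpha) * rng.lognormal(0.0, 0.1, energy.size)

slope, log_a = np.polyfit(np.log(energy), np.log(flux), 1)
print(f"fitted spectral index ~ {-slope:.2f} (true value {true_alpha})")
```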
Wu, Zhenyu; Zou, Ming
2014-10-01
An increasing number of users interact, collaborate, and share information through social networks. Unprecedented growth in social networks is generating a significant amount of unstructured social data. From such data, distilling communities where users have common interests and tracking variations of users' interests over time are important research tracks in fields such as opinion mining, trend prediction, and personalized services. However, these tasks are extremely difficult considering the highly dynamic characteristics of the data. Existing community detection methods are time consuming, making it difficult to process data in real time. In this paper, dynamic unstructured data is modeled as a stream. Tag assignments stream clustering (TASC), an incremental scalable community detection method, is proposed based on locality-sensitive hashing. Both tags and latent interactions among users are incorporated in the method. In our experiments, the social dynamic behaviors of users are first analyzed. The proposed TASC method is then compared with state-of-the-art clustering methods such as StreamKmeans and incremental k-clique; results indicate that TASC can detect communities more efficiently and effectively. Copyright © 2014 Elsevier Ltd. All rights reserved.
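A minimal sketch of the locality-sensitive-hashing mechanism an approach like TASC can build on (an illustration of the general technique, not the paper's implementation): MinHash signatures of each user's tag set are split into bands and hashed into buckets, so users with similar tag sets collide and can be grouped incrementally as the stream arrives.

```python
# Hedged sketch of MinHash-based LSH for incremental grouping of users by tag
# sets. Parameters and data are illustrative, not the paper's implementation.
import hashlib
from collections import defaultdict

NUM_HASHES, BANDS = 12, 4                  # 3 rows per band
ROWS = NUM_HASHES // BANDS

def h(seed, item):
    return int(hashlib.md5(f"{seed}:{item}".encode()).hexdigest(), 16)

def minhash(tags):
    return tuple(min(h(seed, t) for t in tags) for seed in range(NUM_HASHES))

buckets = defaultdict(set)                 # (band, band-signature) -> users
communities = defaultdict(set)

def add_user(user, tags):
    """Process one tag-assignment event from the stream."""
    sig = minhash(tags)
    for band in range(BANDS):
        key = (band, sig[band * ROWS:(band + 1) * ROWS])
        for other in buckets[key]:         # users colliding in any band
            communities[other].add(user)
            communities[user].add(other)
        buckets[key].add(user)

add_user("alice", {"python", "ml", "data"})
add_user("bob",   {"python", "ml", "data"})   # identical tags: collides in every band
add_user("carol", {"gardening", "roses"})
print({u: sorted(c) for u, c in communities.items()})
```

Identical tag sets collide in every band, while merely overlapping sets collide with a probability that grows with their Jaccard similarity; processing each event against hash buckets, rather than against all prior users, is what makes this style of clustering incremental and scalable.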
Lesion complexity drives the cost of superficial femoral artery endovascular interventions
Walker, Karen L.; Nolan, Brian W.; Columbo, Jesse A.; Rzucidlo, Eva M.; Goodney, Philip P.; Walsh, Daniel B.; Atkinson, Benjamin J.; Powell, Richard J.
2017-01-01
Objective: Patients who undergo endovascular treatment of superficial femoral artery (SFA) disease vary greatly in lesion complexity and treatment options. This study examined the association between lesion severity and the cost of SFA stenting and determined whether procedure cost affects primary patency at 1 year.
Methods: A retrospective record review identified patients undergoing initial SFA stenting between January 1, 2010, and February 1, 2012. Medical records were reviewed to collect data on demographics, comorbidities, indication for the procedure, TransAtlantic Inter-Society Consensus (TASC) II severity, and primary patency. The interventional radiology database and hospital accounting database were queried to determine cost drivers of SFA stenting. Procedure supply cost included any item with a bar code used for the procedure. Associations between cost drivers and lesion characteristics were explored. Primary patency was determined using Kaplan-Meier survival curves and a log-rank test.
Results: During the study period, 95 patients underwent stenting in 98 extremities; 61% of SFA stents were placed for claudication, with 80% of lesions classified as TASC II A or B. Primary patency at 1 year was 79% for the entire cohort. The mean total cost per case was $10,333. Increased procedure supply cost was associated with adjunct device use, the number of stents, and TASC II severity. Despite the higher costs of treating more complex lesions, primary patency at 1 year was similar at 80% for high-cost (supply cost >$4000) vs 78% for low-cost (supply cost <$4000) interventions.
Conclusions: SFA lesion complexity, as defined by TASC II severity, drives the cost of endovascular interventions but does not appear to disadvantage patency at 1 year. Reimbursement agencies should consider incorporating disease severity into reimbursement algorithms for lower extremity endovascular interventions. PMID:26206581
Endovascular stents: a review of their use in peripheral arterial disease.
Kudagi, Vinod S; White, Christopher J
2013-06-01
Technological advances in the past decade have shifted revascularization strategies from traditional open surgical approaches toward lower-morbidity percutaneous endovascular treatments for patients with lower extremity peripheral arterial disease (PAD). The continuing advances in stent design, more than any other advances, have fueled the growth of catheter-based procedures by improving the safety, durability, and predictability of percutaneous revascularization. Although the 2007 TransAtlantic Inter-Society Consensus (TASC) guidelines recommend endovascular therapy for type A and B aortoiliac and femoropopliteal lesions, recent developments in stent technology and increased experience of interventionists have suggested that a strategy of endovascular therapy first is appropriate in experienced hands for TASC type D lesions. The role of endovascular interventions is also expanding in the treatment of limb-threatening ischemia.
7 CFR 1487.1 - What special definitions apply to the TASC program?
Code of Federal Regulations, 2010 CFR
2010-01-01
.... organization, including, but not limited to, U.S. government agencies, State government agencies, non-profit trade associations, universities, agricultural cooperatives, and private companies. FAS—Foreign...
Endovascular interventions for TASC II D femoropopliteal lesions.
Baril, Donald T; Chaer, Rabih A; Rhee, Robert Y; Makaroun, Michel S; Marone, Luke K
2010-06-01
Advances in endovascular techniques have provided new options in the treatment of complex infrainguinal occlusive lesions. The purpose of this study was to evaluate outcomes of endovascular interventions on TransAtlantic Inter-Society Consensus (TASC) II D femoropopliteal occlusive disease. All patients undergoing endovascular interventions for femoropopliteal occlusive disease between July 2004 and July 2009 were reviewed. Patient demographics, pre- and postprocedure ankle-brachial indices (ABI) and anatomic factors were analyzed. Outcomes evaluated included primary patency, assisted patency, secondary patency, predictors of restenosis, and wound healing. Five hundred eighty-five limbs were treated during the period reviewed. The study group included 79 TASC D limbs in 74 patients (mean age 76.5 +/- 11.9 years, male sex: 53%). Fifty-six limbs (71%) underwent treatment for critical limb ischemia, including 42 (53%) with tissue loss. Eleven patients (15%) had previous failed bypasses. Preoperative ABIs were unobtainable for 23 patients, while the remaining 56 had a mean baseline ABI of 0.54 +/- 0.28. There was one periprocedural mortality. Five patients (6.3%) had periprocedural complications. Mean increase in ABI postprocedure was 0.49 +/- 0.35. Follow-up was available for 74 limbs at a mean of 10.7 months (range, 1-35 months). There were 18 mortalities (24.3%) during the follow-up period. No patient required a major amputation during this follow-up period. Twenty-one limbs (26.6%) experienced restenosis and nine limbs (11.4%) experienced occlusion. Twenty-nine limbs underwent reintervention during the follow-up time, including nine which underwent multiple reinterventions. Primary, assisted-primary, and secondary patency rates at 12 and 24 months were 52.2%, 88.4%, 92.6% and 27.5%, 74.2%, and 88.9%, respectively. Predictors of restenosis/occlusion included hypercholesterolemia, the presence of a popliteal artery stent, and current or former smoking. Endovascular interventions for TASC II D lesions can be safely performed with excellent hemodynamic improvement and limb salvage rates. Restenosis is not uncommon in this population, which mandates strict follow-up. Further follow-up is necessary to determine the long-term efficacy of these interventions. Copyright (c) 2010 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Qiuliang; Dai, Yinming; Zhao, Baozhi; Song, Souseng; Chen, Shunzhong; Yan, Luguang
2009-06-01
This article has been retracted: please see Elsevier Policy on Article Withdrawal (http://www.elsevier.com/locate/withdrawalpolicy). This article has been retracted at the request of the Editors. The article duplicates significant parts of a paper that had already appeared in IEEE TASC, VOL. 18 (2008) 548-551. 10.1109/TASC.2008.921295. One of the conditions of submission of a paper for publication is that authors declare explicitly that the paper is not under consideration for publication elsewhere. Re-use of any data should be appropriately cited. As such this article represents a severe abuse of the scientific publishing system. The scientific community takes a very strong view on this matter and apologies are offered to readers of the journal that this was not detected during the submission process.
The Portable War Room Research Project
NASA Technical Reports Server (NTRS)
Govers, Francis X., III; Fry, Mark
1997-01-01
The Portable War Room is an internal TASC project to research and develop a visualization and simulation environment to provide for decision makers the power to review the past, understand the present, and peer into the future.
Mwipatayi, Bibombe P; Sharma, Surabhi; Daneshmand, Ali; Thomas, Shannon D; Vijayan, Vikram; Altaf, Nishath; Garbowski, Marek; Jackson, Mark
2016-07-01
The Covered vs Balloon Expandable Stent Trial (COBEST) is the first multicenter trial to investigate the patency of covered stents (CSs) and bare-metal stents (BMSs) in the treatment of aortoiliac arterial disease. The short-term results demonstrated that CSs were superior to BMSs in maintaining patency for TransAtlantic Inter-Society Consensus (TASC) C and D lesions at 18 months and were equivalent to BMSs for TASC B lesions. The current study was conducted to determine if the initial patency advantage of CSs over BMSs was sustained at the 5-year follow-up. A retrospective post hoc analysis of COBEST was performed. Originally, 125 patients with 168 iliac arteries were prospectively enrolled and randomly assigned to receive a CS or BMS. In this study, 77 of the 125 patients (61.6%; 119 limbs) were assessed at 60 months for the primary and secondary end points, with particular attention paid to the outcomes stratified according to TASC lesion severity. The primary end point was the rate of binary stenosis or freedom from stent occlusion of the treated area, as determined by ultrasound imaging or quantitative visual angiography. The 5-year results of the COBEST showed that the CS had a significantly higher patency rate than the BMS at 18, 24, 48, and 60 months (95.1%, 82.1%, 79.9%, 74.7% for CS vs 73.9%, 70.9%, 63% and 62.5% for BMS; log-rank test, P = .01). On multivariate analysis, the type of stent used (hazard ratio [HR], 2.797; 95% confidence interval [CI], 1.471-5.318; P = .002) and the Rutherford classification (HR, 2.019; 95% CI, 1.278-3.191; P = .026) significantly affected the adjusted primary patency. On subgroup analysis, the CS showed significantly higher patency and a survival benefit compared with the BMS in TASC C and D lesions (HR, 8.639; 95% CI, 54.253-75.753; P = .003). Moreover, fewer patients received target limb revascularization in the CS group than in the BMS group (odds ratio, 2.32; 95% CI, 1.47-3.36; P = .02); however, there was no statistically significant difference in the rate of amputations between the groups. The 5-year results of the COBEST demonstrated that the CS has an enduring patency advantage over the BMS in both the short and long terms. Furthermore, the CS showed acceptable patency rates for the treatment of more severe TASC C and D lesions, and patients who received a CS required fewer revascularization procedures. However, the choice of stent did not affect the rate of major limb amputations. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Iliac artery angioplasty : technique and results.
Brountzos, E N; Kelekis, D A
2004-10-01
Percutaneous angioplasty is widely used for the treatment of iliac artery occlusive disease. Access to the ipsilateral, or less commonly the contralateral, common femoral artery is obtained under local anaesthesia; the lesion is crossed with a guidewire and dilated with an angioplasty balloon catheter. This technique yields excellent immediate results with very few complications. Stent placement is used in lesions not amenable to balloon angioplasty, in complications, and in recurrences. Evidence suggests that balloon angioplasty is the procedure of choice for iliac artery occlusive lesions, with stent placement reserved for angioplasty failures. However, primary stent placement is indicated in total occlusions. Lesion morphology is an important determinant of immediate success and long-term patency. TASC type A and B lesions are best treated with angioplasty and stenting, while TASC type C and D lesions show better results with surgical treatment. The development of new stent designs may expand the indications for percutaneous treatment.
Benetis, Rimantas; Antusevas, Aleksandras; Kaupas, Rytis Stasys; Inciura, Donatas; Kinduris, Sarunas
2016-01-01
Introduction: The priority use of endovascular techniques in the management of aortoiliac occlusive disease has increased in the last decade. The aim of the present article is to report 1- and 2-year results of iliac artery stenting (IAS) and aortoiliac grafting in the management of patients with TASC II type B, C and D iliac lesions and chronic limb ischaemia.
Material and methods: In this prospective, non-randomised, single-centre clinical study, iliac artery stents and vascular grafts used for the treatment of patients with symptomatic lesions in the iliac artery were evaluated. The study enrolled two groups: 54 patients in the stent group and 47 patients in the surgery group.
Results: The primary patency rates at 1 and 2 years were 83% and 79.9% after IAS and 97.1% and 97.1% after surgical reconstruction, respectively (p = 0.015). The assisted primary stent patency at 1 and 2 years was 87.9% and 78.2%, respectively. The complication rate was 7.4% in the stent group and 6.3% in the surgery group. There was no perioperative mortality in either group.
Conclusions: Our results reveal that patients with severe aortoiliac occlusive disease (TASC II types B, C and D) can be treated with IAS or surgically with satisfactory results. Iliac artery stenting is associated with decreased primary patency compared with surgery. Iliac artery stenting should be considered with priority in elderly patients or in patients with severe comorbidities. PMID:27186180
Bradbury, Andrew W; Adam, Donald J; Bell, Jocelyn; Forbes, John F; Fowkes, F Gerry R; Gillespie, Ian; Ruckley, Charles Vaughan; Raab, Gillian M
2010-05-01
The Bypass versus Angioplasty in Severe Ischaemia of the Leg (BASIL) trial showed that, in patients with severe lower limb ischemia (rest pain, tissue loss) who survived for 2 years after intervention, initial randomization to bypass surgery, compared with balloon angioplasty, was associated with an improvement in subsequent amputation-free survival and overall survival of about 6 and 7 months, respectively. The aim of this report is to describe the angiographic severity and extent of infrainguinal arterial disease in the BASIL trial cohort so that the trial outcomes can be appropriately generalized to other patient cohorts with similar anatomic (angiographic) patterns of disease. Preintervention angiograms were scored using the Bollinger method and the TransAtlantic Inter-Society Consensus (TASC) II classification system by three consultant interventional radiologists and two consultant vascular surgeons unaware of the treatment received or patient outcomes. As was to be expected from the randomization process, patients in the two trial arms were well matched in terms of angiographic severity and extent of disease as documented by Bollinger and TASC II. In patients with the least overall disease, disease tended to be concentrated in the superficial femoral and popliteal arteries, which were the commonest sites of disease overall. The below-knee arteries became increasingly involved as the overall severity of disease increased, but disease in the above-knee arteries did not tend to worsen. The posterior tibial artery was the most diseased crural artery, whereas the peroneal appeared relatively spared. There was less interobserver disagreement with the Bollinger method than with the TASC II classification system, which also appears inherently less sensitive to clinically important differences in infrapopliteal disease among patients with severe leg ischemia. Anatomic (angiographic) disease description in patients with severe leg ischemia requires a reproducible scoring system that is sensitive to differences in crural artery disease. The Bollinger system appears well suited for this purpose, but the TASC II classification system less so. We hope this detailed analysis will facilitate appropriate generalization of the BASIL trial data to other groups of patients affected by similar anatomic (angiographic) patterns of disease. Crown Copyright (c) 2010. Published by Mosby, Inc. All rights reserved.
Optical Research and Field Services
2004-10-01
effects found during retinal, corneal and lenticular exposures. AFRL/HEDO personnel and TASC team members traveled to Albuquerque, NM and White Sands... astigmatism, thus reducing dependence on spectacles or contact lenses. The aeromedical and military implications of this new technology are immense. With
DOT National Transportation Integrated Search
2012-08-01
In previous phases of this research, we developed a methodology for surveying transit riders about their levels of satisfaction and how important they find various attributes at transit stops and stations. We applied an Importance-Satisfaction Anal...
Khan, Sikandar Z; Rivero, Mariel; Cherr, Gregory S; Harris, Linda M; Dryjski, Maciej L; Dosluoglu, Hasan H
2018-05-15
Infrainguinal revascularization for disabling claudication (DC) is frequently performed, but long-term results are still unknown. In this study, we compared clinical outcomes of infrainguinal endovascular (EV) and open interventions for DC after the failure of medical management. One hundred ninety-four patients with DC (Rutherford category 3) who had open (n = 53) or EV (n = 141) interventions were grouped as open-great saphenous vein (GSV) (n = 21), open-prosthetic (n = 32), EV-Trans-Atlantic Inter-Society Consensus II (TASC II) A and B (AB) (n = 48), and EV-TASC II C and D (CD) (n = 93). Patency, primary clinical success (PCS; sustained improvement in symptoms without reintervention), and secondary clinical success (SCS; sustained improvement in symptoms with reintervention) rates were compared. Mean follow-up was 57 ± 33 months. Five-year primary patency was 58% in open-GSV, 40% in open-prosthetic, 72% in EV-AB, and 38% in EV-CD (P < 0.001). Five-year secondary patency was 77% in open-GSV, 50% in open-prosthetic, 96% in EV-AB, and 61% in EV-CD (P < 0.001). Freedom from major adverse limb events was 73% in open-GSV, 77% in EV-AB, 70% in EV-CD, and 67% in open-prosthetic (P = 0.279). Five-year PCS was 46% in open-GSV, 40% in open-prosthetic, 57% in EV-AB, and 44% in EV-CD (P = 0.02). Five-year SCS was 78% in open-GSV, 78% in open-prosthetic, 85% in EV-AB, and 84% in EV-CD (P = 0.732). A total of 116 reinterventions were performed, 10 in 6 limbs (27%) in open-GSV, 18 in 12 limbs (36%) in open-prosthetic, 26 in 15 limbs (24%) in EV-AB, and 62 in 39 limbs (36%) in EV-CD. Reinterventions included 71 (61%) EV and 45 (39%) open procedures. Durability of infrainguinal interventions in claudicants depends mainly on anatomic complexity of disease. Good long-term clinical success can be achieved with both open and EV interventions, albeit with high reintervention rates, especially in patients with TASC II C and D disease. A considerable subset of EV patients will eventually require surgical revascularization to maintain clinical benefit. In this study, almost 20% of patients undergoing EV for TASC II C and D disease eventually required surgical bypass. Copyright © 2018 Elsevier Inc. All rights reserved.
Buxbaum, Joseph D; Bolshakova, Nadia; Brownfeld, Jessica M; Anney, Richard Jl; Bender, Patrick; Bernier, Raphael; Cook, Edwin H; Coon, Hilary; Cuccaro, Michael; Freitag, Christine M; Hallmayer, Joachim; Geschwind, Daniel; Klauck, Sabine M; Nurnberger, John I; Oliveira, Guiomar; Pinto, Dalila; Poustka, Fritz; Scherer, Stephen W; Shih, Andy; Sutcliffe, James S; Szatmari, Peter; Vicente, Astrid M; Vieland, Veronica; Gallagher, Louise
2014-01-01
There is an urgent need for expanding and enhancing autism spectrum disorder (ASD) samples, in order to better understand causes of ASD. In a unique public-private partnership, 13 sites with extensive experience in both the assessment and diagnosis of ASD embarked on an ambitious, 2-year program to collect samples for genetic and phenotypic research and begin analyses on these samples. The program was called The Autism Simplex Collection (TASC). TASC sample collection began in 2008 and was completed in 2010, and included nine sites from North America and four sites from Western Europe, as well as a centralized Data Coordinating Center. Over 1,700 trios are part of this collection, with DNA from transformed cells now available through the National Institute of Mental Health (NIMH). Autism Diagnostic Interview-Revised (ADI-R) and Autism Diagnostic Observation Schedule-Generic (ADOS-G) measures are available for all probands, as are standardized IQ measures, Vineland Adaptive Behavioral Scales (VABS), the Social Responsiveness Scale (SRS), Peabody Picture Vocabulary Test (PPVT), and physical measures (height, weight, and head circumference). At almost every site, additional phenotypic measures were collected, including the Broad Autism Phenotype Questionnaire (BAPQ) and Repetitive Behavior Scale-Revised (RBS-R), as well as the non-word repetition scale, Communication Checklist (Children's or Adult), and Aberrant Behavior Checklist (ABC). Moreover, for nearly 1,000 trios, the Autism Genome Project Consortium (AGP) has carried out Illumina 1 M SNP genotyping and called copy number variation (CNV) in the samples, with data being made available through the National Institutes of Health (NIH). Whole exome sequencing (WES) has been carried out in over 500 probands, together with ancestry matched controls, and these data are also available through the NIH. Additional WES is being carried out by the Autism Sequencing Consortium (ASC), where the focus is on sequencing complete trios. ASC sequencing for the first 1,000 samples (all from whole-blood DNA) is complete and data will be released in 2014. Data is being made available through NIH databases (database of Genotypes and Phenotypes (dbGaP) and National Database for Autism Research (NDAR)) with DNA released in Dist 11.0. Primary funding for the collection, genotyping, sequencing and distribution of TASC samples was provided by Autism Speaks and the NIH, including the National Institute of Mental Health (NIMH) and the National Human Genome Research Institute (NHGRI). TASC represents an important sample set that leverages expert sites. Similar approaches, leveraging expert sites and ongoing studies, represent an important path towards further enhancing available ASD samples.
Trends in Teacher Evaluation: What Every Special Education Teacher Should Know
ERIC Educational Resources Information Center
Benedict, Amber E; Thomas, Rachel A.; Kimerling, Jenna; Leko, Christopher
2013-01-01
The article reflects on current methods of teacher evaluation within the context of recent accountability policy, specifically No Child Left Behind. An overview is given of the most common forms of teacher evaluation, including performance evaluations, checklists, peer review, portfolios, the CEC and InTASC standards, the Charlotte Danielson…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-23
... (TASC) program. The intended effect of this notice is to solicit applications from the private sector... be considered for funding, applications must be received by 5 p.m. Eastern Daylight Time, May 21.... FOR FURTHER INFORMATION CONTACT: Entities wishing to apply for funding assistance should contact the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-11
.... The intended effect of this notice is to solicit applications from the private sector and from... description of relevant dates. FOR FURTHER INFORMATION CONTACT: Entities wishing to apply for funding.... Funding Opportunity Description Authority: The TASC program is authorized by section 3205 of Public Law...
ERIC Educational Resources Information Center
Rhodes, Judith L. F.; Thomas, Johanna M.; Lemieux, Catherine M.; Cain, Daphne S.; Guin, Cecile C.
2010-01-01
This article reviews literature describing truancy and its correlates, and it analyzes the current research on truancy prevention programs. Few truancy prevention programs exist in elementary school settings. This article describes Truancy Assessment and Service Centers, a theory-driven program providing case management services to children in 85…
Democracy in Action: Experiential Civics Learning in Afterschool Advocacy Days
ERIC Educational Resources Information Center
Blank, Susan
2006-01-01
Cosponsored by Coalition for After-School Funding (CASF) and The After-School Corporation (TASC), After-School Advocacy Days have been held annually in Albany, NY since 2000. These events are enormously helpful to the two sponsors' efforts to influence officials who make decisions about funding afterschool programs. The annual event is designed to…
Damera, Sheshagiri Rao; Barik, Ramachandra; Prasad, Akula Siva
2016-09-01
Angioplasty of chronic total aortoiliac occlusion via the transfemoral approach is controversial. From March 2014 to December 2015, four consecutive patients (4 males; mean age 58.2±6.8 years; age range 51-65 years) underwent angioplasty and stenting of TASC-D occlusion. In all cases, we failed to cross the lesion from the femoral approach. On switching to left brachial access, angioplasty was completed successfully in all. There were no access-site complications and no clinical evidence of cerebral thromboembolism. Self-expandable stents were implanted in all patients, with adequate pre- and post-dilation. Complete revascularisation was achieved in two cases; in the other two cases, angioplasty to the left aortoiliac carina was staged. Therefore, it is better to avoid the femoral approach as the initial route for crossing chronic TASC 2007 type D lesions (chronic total aortoiliac occlusion, also called extensive aortoiliac disease), because retrograde crossing tends to fail owing to a subintimal guide-wire course that can lead to retrograde aortic dissection. Copyright © 2016 Cardiological Society of India. Published by Elsevier B.V. All rights reserved.
ExpandED Options: Learning beyond High School Walls
ERIC Educational Resources Information Center
ExpandED Schools, 2014
2014-01-01
Through ExpandED Options by TASC, New York City high school students get academic credit for learning career-related skills that lead to paid summer jobs. Too many high school students--including those most likely to drop out--are bored or see classroom learning as irrelevant. ExpandED Options students live the connection between mastering new…
GED, HiSET and TASC: A Comparison of High School Equivalency Assessments. ECS Education Trends
ERIC Educational Resources Information Center
Zinth, Jennifer
2015-01-01
Until January 2014, the General Educational Development (GED) was the only option for youth and adults lacking a high school diploma, but needing a high school credential to pursue employment opportunities or postsecondary education. However, in January 2014, some states began administering one or both alternatives to the GED--the Educational…
Expanded Schools: Developing Mindsets to Support Academic Success. Research Brief
ERIC Educational Resources Information Center
ExpandED Schools, 2014
2014-01-01
The national demonstration of ExpandED Schools, The After-School Corporation's (TASC) expanded learning model, was launched in 2011-12 in New York City, Baltimore, and New Orleans. The ExpandED Schools demonstration is being evaluated by Policy Studies Associates (PSA) and is rolling out at a time when there is heightened awareness among…
Proceedings of the First Annual NRO-OSL/GSFC-ATS Rideshare Conference
NASA Technical Reports Server (NTRS)
Cutlip, William (Editor)
1999-01-01
This document contains the proceedings of the First Annual NRO-OSL/GSFC-ATS Rideshare Conference. The conference was held April 16-16, 1999, at the Litton/TASC Facility, Dulles, Virginia, and was co-chaired by William Cutlip, Goddard Space Flight Center Access to Space Group, and Jim Liller, National Reconnaissance Office, Office of Space Launch.
ERIC Educational Resources Information Center
ExpandED Schools, 2014
2014-01-01
This guide is a list of tools that can be used in continued implementation of strong programming powered by Social and Emotional Learning (SEL) competencies. This curated resource pulls from across the landscape of policy, research and practice, with a description of each tool gathered directly from its website.
Energy calibration of CALET onboard the International Space Station
NASA Astrophysics Data System (ADS)
Asaoka, Y.; Akaike, Y.; Komiya, Y.; Miyata, R.; Torii, S.; Adriani, O.; Asano, K.; Bagliesi, M. G.; Bigongiari, G.; Binns, W. R.; Bonechi, S.; Bongi, M.; Brogi, P.; Buckley, J. H.; Cannady, N.; Castellini, G.; Checchia, C.; Cherry, M. L.; Collazuol, G.; Di Felice, V.; Ebisawa, K.; Fuke, H.; Guzik, T. G.; Hams, T.; Hareyama, M.; Hasebe, N.; Hibino, K.; Ichimura, M.; Ioka, K.; Ishizaki, W.; Israel, M. H.; Javaid, A.; Kasahara, K.; Kataoka, J.; Kataoka, R.; Katayose, Y.; Kato, C.; Kawanaka, N.; Kawakubo, Y.; Kitamura, H.; Krawczynski, H. S.; Krizmanic, J. F.; Kuramata, S.; Lomtadze, T.; Maestro, P.; Marrocchesi, P. S.; Messineo, A. M.; Mitchell, J. W.; Miyake, S.; Mizutani, K.; Moiseev, A. A.; Mori, K.; Mori, M.; Mori, N.; Motz, H. M.; Munakata, K.; Murakami, H.; Nakagawa, Y. E.; Nakahira, S.; Nishimura, J.; Okuno, S.; Ormes, J. F.; Ozawa, S.; Pacini, L.; Palma, F.; Papini, P.; Penacchioni, A. V.; Rauch, B. F.; Ricciarini, S.; Sakai, K.; Sakamoto, T.; Sasaki, M.; Shimizu, Y.; Shiomi, A.; Sparvoli, R.; Spillantini, P.; Stolzi, F.; Takahashi, I.; Takayanagi, M.; Takita, M.; Tamura, T.; Tateyama, N.; Terasawa, T.; Tomida, H.; Tsunesada, Y.; Uchihori, Y.; Ueno, S.; Vannuccini, E.; Wefel, J. P.; Yamaoka, K.; Yanagita, S.; Yoshida, A.; Yoshida, K.; Yuda, T.
2017-05-01
In August 2015, the CALorimetric Electron Telescope (CALET), designed for long exposure observations of high energy cosmic rays, docked with the International Space Station (ISS) and shortly thereafter began to collect data. CALET will measure the cosmic ray electron spectrum over the energy range of 1 GeV to 20 TeV with a very high resolution of 2% above 100 GeV, based on a dedicated instrument incorporating an exceptionally thick 30 radiation-length calorimeter with both total absorption and imaging (TASC and IMC) units. Each TASC readout channel must be carefully calibrated over the extremely wide dynamic range of CALET that spans six orders of magnitude in order to obtain a degree of calibration accuracy matching the resolution of energy measurements. These calibrations consist of calculating the conversion factors between ADC units and energy deposits, ensuring linearity over each gain range, and providing a seamless transition between neighboring gain ranges. This paper describes these calibration methods in detail, along with the resulting data and associated accuracies. The results presented in this paper show that a sufficient accuracy was achieved for the calibrations of each channel in order to obtain a suitable resolution over the entire dynamic range of the electron spectrum measurement.
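The calibration chain summarized above (pedestal subtraction, per-channel ADC-to-energy conversion factors, and a seamless hand-off between neighboring gain ranges) can be made concrete with a small sketch. The following Python fragment is purely illustrative: the gain-range names, calibration constants, and function names are assumptions for this document, not CALET flight software.

def adc_to_energy(adc, pedestal, mev_per_adc):
    """Convert raw ADC counts into an energy deposit (MeV) for one channel."""
    return (adc - pedestal) * mev_per_adc

def stitched_energy(readings, cal):
    """Use the most sensitive unsaturated gain range, then convert.

    readings -- raw ADC counts per gain range, e.g. {"high": 812, "low": 14}
    cal      -- per-range tuples (pedestal, mev_per_adc, saturation_adc)
    """
    for gain in ("high", "low"):  # ordered from most to least sensitive
        pedestal, mev_per_adc, saturation_adc = cal[gain]
        if readings[gain] < saturation_adc:
            return adc_to_energy(readings[gain], pedestal, mev_per_adc)
    raise ValueError("all gain ranges saturated")

The per-channel (pedestal, mev_per_adc, saturation_adc) tuples stand in for the conversion factors and linearity limits the paper derives; stitching by "first unsaturated range" is one simple way to realize the seamless transition the abstract describes.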
ERIC Educational Resources Information Center
Treese, Matthew Paul
2012-01-01
Public school districts in Pennsylvania use varying teacher screening and interviewing processes for hiring teachers. In order to hire the best teacher candidates for vacancies, the qualities of effective teachers such as those cited by the Council of Chief State School Officers Interstate Teacher Assessment and Support Consortium (InTASC) Model…
US Military Aircraft Cost Handbook.
1983-03-01
Management Consulting & Research, Inc. (MCR) collected... methodology is called TASCFORM-AIR. Management Consulting & Research, Inc. (MCR), under subcontract to assist in this effort, has developed the US... expanded. This edition was produced as part of the current TASC/MCR effort for OSD Net Assessment to build on the earlier cost-performance work conducted
Malas, Mahmoud B; Qazi, Umair; Glebova, Natalia; Arhuidese, Isibor; Reifsnyder, Thomas; Black, James; Perler, Bruce A; Freischlag, Julie A
2014-12-01
To our knowledge, there is no level 1 evidence comparing open bypass with angioplasty and stenting in TransAtlantic Inter-Society Consensus (TASC II) B and C superficial femoral artery lesions. The Revascularization With Open Bypass vs Angioplasty and Stenting of the Lower Extremity Trial (ROBUST) is the first prospective randomized clinical trial comparing both treatments. To report the design of the ROBUST trial. The primary aim of the trial is to compare (1) the patency rate (primary, primary assisted, and secondary patency at 6 and 12 months), (2) improvement of quality of life, (3) clinical improvement (at least 1 Rutherford category), and (4) wound healing and limb salvage in patients presenting with critical limb ischemia; secondary aims include (1) cost-effectiveness by factoring procedure and hospital admission costs including rehabilitation, readmission, and reintervention costs, (2) amputation-free survival, (3) reintervention rate, and (4) 30-day operative mortality, morbidity, and wound and access complications. ROBUST is a prospective randomized clinical trial with the aim to enroll 320 patients with intermittent claudication that does not respond to medical management and patients with critical limb ischemia. The maximum level of medical therapy will be administered using antiplatelet agents and statins, as well as measures to control hypertension and diabetes mellitus. Patients with TASC II B or C lesions are prospectively randomized to receive either femoropopliteal bypass or percutaneous transluminal angioplasty and stenting; patients with TASC II A and D lesions are not randomized and receive percutaneous transluminal angioplasty and stenting or femoropopliteal bypass, respectively. All patients will be evaluated at 1, 6, and 12 months postoperatively with physical examination, ankle brachial index, duplex, and a quality-of-life questionnaire. The trial is actively enrolling participants. At the time of writing, 29 patients have been enrolled; most are male (60%) and white (65%). Providing level 1 evidence, ROBUST may help to establish guidelines for the treatment of superficial femoral artery lesions, eliminate unnecessary procedures, and reduce health care costs. clinicaltrials.gov Identifier: NCT01602159.
DeRubertis, Brian G; Vouyouka, Angela; Rhee, Soo J; Califano, Joseph; Karwowski, John; Angle, Niren; Faries, Peter L; Kent, K Craig
2008-07-01
Experience with open surgical bypass suggests similar overall outcomes in women compared with men, but significantly increased risk of wound complications. Percutaneous treatment of lower extremity occlusive disease is therefore an attractive alternative in women, although it is not clear whether there is a difference in outcomes between women and men treated with this technique. We sought to determine the results and predictors of failure in women treated by percutaneous intervention. Percutaneous infrainguinal revascularization was performed on 309 women between 2001 and 2006. Procedures, complications, demographics, comorbidities, and follow-up data were entered into a prospective database for review. Patency was assessed primarily by duplex ultrasonography. Outcomes were expressed by Kaplan-Meier curves and compared by log-rank analysis. A total of 447 percutaneous interventions performed in 309 women were analyzed and compared with 553 interventions in men. Mean age in women was 73.2 years; comorbidities included hypertension (HTN) (86%), diabetes mellitus (DM) (58%), chronic renal insufficiency (CRI) (15%), hemodialysis (7%), hypercholesterolemia (52%), coronary artery disease (CAD) (42%), and tobacco use (47%). Indications in women included claudication (38.0%), rest pain (18.8%), and tissue loss (43.2%). Overall primary and secondary patency and limb-salvage rates for women were 38% +/- 4%, 66% +/- 3%, and 80% +/- 4% at 24 months. In this patient sample, women were significantly more likely than men to present with limb-threatening ischemia (61.6% vs 47.3%, P < 0.001) and have lesions of TASC C and D severity (71.4% vs 61.7%, P < .005). However, there were no significant differences in primary and secondary patency rates or limb-salvage rates between genders. Furthermore, while women with limb-threat, diabetes, and advanced TASC severity lesions were at increased risk of failure overall, there were no differences between women and men with these characteristics. Percutaneous infrainguinal revascularization is a very effective modality in women with lower extremity occlusive disease. Although women in this sample were more likely to present with limb-threat than men, patency and limb-salvage rates were equivalent between genders, even in high-risk subsets such as diabetics or those with lesions of increased TASC severity.
ERIC Educational Resources Information Center
Wallace, Belle; Bernardelli, Alessio; Molyneux, Clare; Farrell, Clare
2012-01-01
All children are born with the gifts of curiosity and creativity--and an insatiable appetite for asking questions to find out about the world in which they live. Fostering these questions and developing inquisitive and investigating minds is one of the essential roles of parent and teacher, and the processes of enquiry are the necessary routes for…
Kopp, R; Weidenhagen, R; Hornung, H; Jauch, K W; Lauterjung, L
2003-12-01
The diagnosis of acute peripheral ischemia can be obtained based on clinical presentation, inspection, and palpation of the affected extremity. Unfractionated heparin as a single shot is immediately given followed by continuous infusion when diagnosis is clinically evident and contraindications are excluded. Thromboembolectomy using a Fogarty catheter is immediately performed in patients with evidence of arterial embolization and signs of advanced ischemia (TASC IIb/III) followed by intraoperative angiography. Patients with evidence of arterial thrombosis require urgent angiography followed by thrombectomy and probably subsequent endovascular or surgical interventions and vascular reconstruction. For patients with moderate ischemia (TASC IIa), initial diagnostic angiography is recommended followed by primary thrombectomy with subsequent intraoperative angiography and immediate endovascular or operative treatment of remaining vascular problems. As an alternative therapeutic option initial catheter-guided local thrombolysis can be performed in selected patients with the intention of subsequent limb revascularization or unmasking relevant vessel alterations leading to specific endovascular or surgically performed vascular reconstruction. Possible development of muscle ischemia because of increased compartment pressure should be considered and fasciotomy performed when indicated. Primary amputation of the severely ischemic limb after initial thrombectomy might be recommended in patients with life-threatening organ failure related to muscle necrosis.
Effective Acquisition Strategies for Systems Engineering and Technical Assistance (SETA)
2012-07-01
Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to a penalty for failing to comply with a collection...environment will consist of various distributed data sources, or nodes, that are fused together in order to enhance battlefield awareness. In the event that...Northrop Grumman sold its advisory services business, TASC, to private-equity investors (see the Appendix). In August 2010, CSC announced the sale of
Visual Search in the Detection of Retinal Injury: A Feasibility Study
2013-04-01
D, Heyes A, et al. Mobility of people with retinitis pigmentosa as a function of vision and psychological variables. Optometry and Vision Science... AFRL-RH-FS-TR-2013-0019, Visual Search in the Detection of Retinal Injury: A Feasibility Study, Thomas Kuyk (TASC, Inc.) and Lei Liu...
Software component quality evaluation
NASA Technical Reports Server (NTRS)
Clough, A. J.
1991-01-01
The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.
An Approximate Analytical Model of Shock Waves from Underground Nuclear Explosions
1990-12-01
Explosions, University of California Radiation Laboratory, Rep. UCRL-5675, pp. 120-134, 1959. Perret, W. R., and R. C. Bass, Free-field ground motion...
Training Decisions Technology Analysis
1992-06-01
4.5.1 Relational Data Base Management; 4.5.2 TASCS Data Content; 4.5.3 Relationships with TDS; 4.6 Other Air Force Modeling R&D; 4.6.1 Time ... executive decision making were first developed by M. S. Scott Morton in the early 1970s who, at that time, termed them "management decision systems" (Scott... Allocations to Training Settings o Managers' Preferences for Task Allocations to Training Settings o Times Required to Training Tasks in Various
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-25
..., Software, Implants, and Components Thereof; Notice of Receipt of Complaint; Solicitation of Comments... Certain Computerized Orthopedic Surgical Devices, Software, Implants, and Components Thereof, DN 2945; the... importation of certain computerized orthopedic surgical devices, software, implants, and components thereof...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-17
..., Components Thereof, Associated Software, and Products Containing the Same; Notice of Investigation AGENCY: U... scanning devices, components thereof, associated software, and products containing the same by reason of... after importation of certain biometric scanning devices, components thereof, associated software, or...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-02
... Software and Firmware, and Components Thereof and Products Containing the Same; Institution of..., related software and firmware, and components thereof and products containing the same by reason of... after importation of certain cameras and mobile devices, related software and firmware, and components...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
... Products, Components Thereof, and Related Software; Notice of Institution of Investigation; Institution of... importation of certain GPS navigation products, components thereof, and related software by reason of... importation of certain GPS navigation products, components thereof, and related software that infringe one or...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-13
... Software, and Components Thereof Final Determination of Violation; Issuance of a Limited Exclusion Order... importation of certain mobile devices, associated software, and components thereof by reason of infringement... importation of certain mobile devices, associated software, and components thereof containing same by reason...
Subintimal angioplasty with the aid of a re-entry device for TASC C and D lesions of the SFA.
Setacci, C; Chisci, E; de Donato, G; Setacci, F; Iacoponi, F; Galzerano, G
2009-07-01
The aim of this prospective study was to assess the clinical effectiveness and related midterm patency of subintimal angioplasty (SAP) in patients suffering from critical limb ischaemia (CLI) in a single tertiary care university centre. The secondary aim was to evaluate the safety and clinical effectiveness of using a re-entry device when re-canalisation by SAP was unsuccessful. From January 2005 to December 2007, consecutive patients suffering from CLI (Rutherford clinical categories: 4-6) were treated with SAP. All patients included in the study had occluded SFA (TASC C and D) and underwent clinical and ultrasound follow-up examinations at day 30 and at 3, 6, 9 and 12 months, and then yearly. A re-entry device (Outback, Cordis Corporation, Miami Lakes, Florida, USA in all cases) was only used when re-canalisation by simple SAP was unsuccessful, and stenting was used when residual stenosis was >30% or there was a flow-limiting dissection. Factors that could modify the outcome were analysed. In this study, 145 patients were treated, with a technical success rate of 83.5% (121 of 145) for simple SAP. Stenting was performed in 43% (n=62) of successful SAP procedures. No death occurred in the perioperative period, while the 30-day mortality was 4.8% (7 of 145). The re-entry device (Outback) was used in 24 cases (16.5%). The technical success of the re-entry device was 79% (19 of 24), with a 90% success rate of stent placement at the site of re-entry. Complications occurred in 6.2% of all procedures (n=9) (three arterial perforations (2.1%), three distal embolisations (2.1%), two femoral artery pseudo-aneurysms (1.4%) and one arterio-venous fistula (0.7%)). Factors capable of independently affecting the patency were renal insufficiency (p=0.03), current smoking (p=0.01) and diabetes (p=0.04). The primary patency at 1 and 3 years was 70% and 34% and the secondary patency at 1 and 3 years was 77% and 43%, respectively. At the same time intervals, the limb-salvage rate was 88% and 49%. SAP with the aid of a re-entry device for TASC C and D lesions of the SFA seems to be safe and clinically effective in patients suffering from CLI, according to the experience at our centre. Further follow-up and more data are necessary to confirm these findings.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-23
... Entertainment Consoles, Related Software, and Components Thereof; Notice of Investigation AGENCY: U.S..., related software, and components thereof by reason of infringement of certain claims of U.S. Patent No. 5... gaming and entertainment consoles, related software, and components thereof that infringe one or more of...
Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment
NASA Technical Reports Server (NTRS)
Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun
2006-01-01
Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.
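One common reading of the "middle ground" this abstract describes is an adapter layer: the trusted legacy application keeps running as its own executable, while a thin component exposes it to the component framework. The Python sketch below illustrates that general pattern only; the tool name, method names, and result format are hypothetical, not Ensemble's actual API.

import subprocess

class SequenceValidatorComponent:
    """Adapter exposing a legacy validator executable as a component service."""

    def __init__(self, exe_path="legacy_seq_validate"):  # hypothetical tool name
        self.exe_path = exe_path

    def validate(self, sequence_file):
        # Delegate to the existing application and translate its exit status
        # into a result the component framework can consume.
        result = subprocess.run([self.exe_path, sequence_file],
                                capture_output=True, text=True)
        return {"ok": result.returncode == 0, "log": result.stdout}

The design choice is that nothing in the legacy tool changes: only the adapter needs to know how to invoke it and interpret its output.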
A database for TMT interface control documents
NASA Astrophysics Data System (ADS)
Gillies, Kim; Roberts, Scott; Brighton, Allan; Rogers, John
2016-08-01
The TMT Software System consists of software components that interact with one another through a software infrastructure called TMT Common Software (CSW). CSW consists of software services and library code that is used by developers to create the subsystems and components that participate in the software system. CSW also defines the types of components that can be constructed and their roles. The use of common component types and shared middleware services allows standardized software interfaces for the components. A software system called the TMT Interface Database System was constructed to support the documentation of the interfaces for components based on CSW. The programmer describes a subsystem and each of its components using JSON-style text files. A command interface file describes each command a component can receive and any commands a component sends. The event interface files describe status, alarms, and events a component publishes and status and events subscribed to by a component. A web application was created to provide a user interface for the required features. Files are ingested into the software system's database. The user interface allows browsing subsystem interfaces, publishing versions of subsystem interfaces, and constructing and publishing interface control documents that consist of the intersection of two subsystem interfaces. All published subsystem interfaces and interface control documents are versioned for configuration control and follow the standard TMT change control processes. Subsystem interfaces and interface control documents can be visualized in the browser or exported as PDF files.
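As a rough illustration of the JSON-style interface files and the ingest step described above, here is a minimal Python sketch. The field names and the example subsystem are invented for this sketch; the real TMT schema is defined by the project.

import json

COMMAND_INTERFACE = """
{
  "subsystem": "tcs",
  "component": "pointing-assembly",
  "receives": [
    {"name": "setTarget",
     "parameters": [{"name": "ra", "type": "double"},
                    {"name": "dec", "type": "double"}]}
  ],
  "sends": ["trackingEnabled"]
}
"""

def ingest(text, db):
    """Parse one interface description and store it keyed by subsystem.component."""
    record = json.loads(text)
    key = record["subsystem"] + "." + record["component"]
    db[key] = record
    return key

db = {}
print(ingest(COMMAND_INTERFACE, db))  # -> tcs.pointing-assembly

An interface control document, in this simplified picture, would then be the intersection of two such ingested records: the commands and events one subsystem sends that the other declares it receives or subscribes to.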
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-25
... Computing Devices, Related Software, and Components Thereof; Notice of Investigation AGENCY: U.S... devices, related software, and components thereof by reason of infringement of certain claims of U.S... devices, related software, and components thereof that infringe one or more of claims 1 and 5 of the '372...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... Hardware and Software Components Thereof; Notice of Investigation AGENCY: U.S. International Trade... boxes, and hardware and software components thereof by reason of infringement of certain claims of U.S... after importation of certain set-top boxes, and hardware and software components thereof that infringe...
2016-01-06
of-breed software components and software product lines (SPLs) that are subject to different IP license and cybersecurity requirements. The... commercially priced closed source software components, to be used in the design, implementation, deployment, and evolution of open architecture (OA)... The Department
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-05
..., Associated Software, and Components Thereof; Notice of Investigation AGENCY: U.S. International Trade..., associated software, and components thereof by reason of infringement of certain claims of U.S. Patent No. 5..., associated software, and components thereof that infringe one or more of claims 1-4, 22, 26, 31, and 36 of...
Component-based integration of chemistry and optimization software.
Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L
2004-11-15
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
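The interchangeability claim here (an optimizer written against an abstract interface works with any chemistry package that conforms to it) can be illustrated with a toy sketch. The class and method names below are invented stand-ins, not the actual CCA, NWChem, or TAO interfaces.

from abc import ABC, abstractmethod

class ModelEvaluator(ABC):
    """Abstract interface an optimizer can rely on, regardless of backend."""
    @abstractmethod
    def energy(self, coords): ...
    @abstractmethod
    def gradient(self, coords): ...

class QuadraticToy(ModelEvaluator):
    """Stand-in for a real chemistry component exposing the same interface."""
    def energy(self, coords):
        return sum(x * x for x in coords)
    def gradient(self, coords):
        return [2.0 * x for x in coords]

def steepest_descent(model, coords, step=0.1, iters=50):
    # Written only against the abstract interface, so any conforming
    # chemistry component can be swapped in for QuadraticToy.
    for _ in range(iters):
        coords = [x - step * g for x, g in zip(coords, model.gradient(coords))]
    return coords

print(steepest_descent(QuadraticToy(), [1.0, -2.0]))  # approaches [0.0, 0.0]

Swapping packages then means providing another ModelEvaluator implementation; the optimization driver is untouched, which is the essence of the component approach the abstract reports.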
NASA Technical Reports Server (NTRS)
Hall, Laverne; Hung, Chaw-Kwei; Lin, Imin
2000-01-01
The purpose of this paper is to provide a description of NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open inter-operable systems software development and software reuse. It will address what is meant by the term object component software, give an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerate the benefits of this approach, and give examples of application prototypes demonstrating its usage and advantages. Utilization of the object-oriented component technology approach for system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-08-01
An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to market pressures that have motivated a multilevel supply chain structure in other widget industries: recovering design costs, improving quality through specialization, and enabling rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors adopting emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, lack of a standard Computer-Aided Software Environment (CASE) tool notation and lack of a standard CASE tool repository has limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations as well as assemble new tools on demand from existing tools and architecture design repositories.
Toward Reusable Graphics Components in Ada
1993-03-01
Then alternatives for obtaining well-engineered reusable software components were examined. Finally, the alternatives were analyzed, and the most...reusable software components. Chapter 4 describes detailed design and implementation strategies in building a well-engineered reusable set of components in...study. 2.2 The Object-Oriented Paradigm 2.2.1 The Need for Object-Oriented Techniques. Among software engineers the software crisis is a well known
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-04
... INTERNATIONAL TRADE COMMISSION [DN 2891] Certain Cameras and Mobile Devices, Related Software and... complaint entitled Certain Cameras and Mobile Devices, Related Software and Firmware, and Components Thereof... cameras and mobile devices, related software and firmware, and components thereof and products containing...
Software packager user's guide
NASA Technical Reports Server (NTRS)
Callahan, John R.
1995-01-01
Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
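The core mechanism here, turning a modular package description into an integration program in the form of a makefile, can be sketched in a few lines. The specification syntax below is invented for illustration; the actual package specification language described in the abstract is richer and drives tools such as RPC stub generators.

spec = {
    "program": "app",
    "components": ["ui.c", "solver.c"],  # hypothetical component sources
}

def emit_makefile(spec):
    """Produce a toy makefile that builds the program from its components."""
    program = spec["program"]
    objs = " ".join(s.replace(".c", ".o") for s in spec["components"])
    lines = [f"{program}: {objs}", f"\tcc -o {program} {objs}"]
    for src in spec["components"]:
        obj = src.replace(".c", ".o")
        lines += [f"{obj}: {src}", f"\tcc -c {src}"]
    return "\n".join(lines)

print(emit_makefile(spec))

In the real system, the generator would also consult the types of the components and the available translators and adapters, inserting extra build steps (for example, stub generation) where the components' environments differ.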
Using an architectural approach to integrate heterogeneous, distributed software components
NASA Technical Reports Server (NTRS)
Callahan, John R.; Purtilo, James M.
1995-01-01
Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.
Remote Iliac Artery Endarterectomy: A Case Series and Systematic Review.
Bekken, Joost A; de Boer, Sanne W; van der Sluijs, Rogier; Jongsma, Hidde; de Vries, Jean-Paul P M; Fioole, Bram
2018-02-01
To evaluate the long-term results of remote iliac artery endarterectomy (RIAE) in 2 vascular referral centers and review existing literature. A retrospective analysis was conducted of 109 consecutive patients (mean age 64.2±10.7 years; 72 men) who underwent 113 RIAE procedures for lower limb ischemia from January 2004 to August 2015 at 2 vascular centers. The majority of limbs (82, 72.6%) had TASC II D lesions (31 TASC II C). Primary outcome measures were primary, assisted primary, and secondary patency. A comprehensive literature search was performed in the PubMed and EMBASE databases to identify all English-language studies published after 1990 reporting the results of RIAE. Technical success was achieved in 95 (84.1%) of the 113 procedures. The complication rate was 13.7%, and 30-day mortality was 0%. At 5 years, primary patency was 78.2%, assisted primary patency was 83.4%, and secondary patency was 86.7%. Hemodynamic success was obtained in 91.7% of patients, and clinical improvement was observed in 95.2%. Freedom from major amputation was 94.7% at 5 years. The systematic review comprised 6 studies including 419 RIAEs, and pooled data showed results similar to the current study. For external iliac artery occlusions extending into the common femoral artery, RIAE appears to be a valuable hybrid treatment option. It combines acceptable morbidity and low mortality with good long-term patency. It has some advantages over an open surgical iliofemoral bypass or complete endovascular revascularization and could be the best treatment option in selected cases.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-08
... Software, and Components Thereof; Determination To Review Final Initial Determination AGENCY: U.S..., and the sale within the United States after importation of certain mobile devices, associated software... software, and components thereof containing same by reason of infringement of one or more of claims 1, 2, 5...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-03
..., Components Thereof, and Related Software; Institution of Investigation AGENCY: U.S. International Trade... navigation products, components thereof, and related software by reason of infringement of certain claims of... related software that infringe one or more of claims 1, 2, 11, and 16 of the '565 patent; claim 1 of the...
Engineering intelligent tutoring systems
NASA Technical Reports Server (NTRS)
Warren, Kimberly C.; Goodman, Bradley A.
1993-01-01
We have defined an object-oriented software architecture for Intelligent Tutoring Systems (ITS's) to facilitate the rapid development, testing, and fielding of ITS's. This software architecture partitions the functionality of the ITS into a collection of software components with well-defined interfaces and execution concept. The architecture was designed to isolate advanced technology components, partition domain dependencies, take advantage of the increased availability of commercial software packages, and reduce the risks involved in acquiring ITS's. A key component of the architecture, the Executive, is a publish and subscribe message handling component that coordinates all communication between ITS components.
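The Executive described above is a publish-and-subscribe message router: components never call one another directly, they only publish to and subscribe on topics. A minimal sketch of that coordination style follows; the topic names and API are invented for illustration, not the paper's actual interfaces.

from collections import defaultdict

class Executive:
    """Routes messages between ITS components that never call each other."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self._subscribers[topic]:
            handler(message)

bus = Executive()
bus.subscribe("student-answer", lambda m: print("tutor sees:", m))
bus.publish("student-answer", {"question": 3, "correct": False})

Because every interaction passes through the Executive, domain-dependent and advanced-technology components can be replaced without the rest of the system knowing, which is exactly the isolation the architecture aims for.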
Knowledge-based reusable software synthesis system
NASA Technical Reports Server (NTRS)
Donaldson, Cammie
1989-01-01
The Eli system, a knowledge-based reusable software synthesis system, is being developed for NASA Langley under a Phase 2 SBIR contract. Named after Eli Whitney, the inventor of interchangeable parts, Eli assists engineers of large-scale software systems in reusing components while they are composing their software specifications or designs. Eli will identify reuse potential, search for components, select component variants, and synthesize components into the developer's specifications. The Eli project began as a Phase 1 SBIR to define a reusable software synthesis methodology that integrates reusability into the top-down development process and to develop an approach for an expert system to promote and accomplish reuse. The objectives of the Eli Phase 2 work are to integrate advanced technologies to automate the development of reusable components within the context of large system developments, to integrate with user development methodologies without significant changes in method or learning of special languages, and to make reuse the easiest operation to perform. Eli will try to address a number of reuse problems including developing software with reusable components, managing reusable components, identifying reusable components, and transitioning reuse technology. Eli is both a library facility for classifying, storing, and retrieving reusable components and a design environment that emphasizes, encourages, and supports reuse.
Todd, Kevin E; Ahanchi, Sadaf S; Maurer, Christian A; Kim, Jung H; Chipman, Candice R; Panneton, Jean M
2013-10-01
Endovascular adjuncts, like atherectomy, were developed to improve outcomes of endovascular arterial interventions. The true impact of atherectomy on endovascular outcomes remains to be determined, and little data exist on the influence of atherectomy on tibial interventions. Our study compares early and late outcomes of tibial intervention with angioplasty vs atherectomy-assisted interventions. We completed a retrospective review of all tibial interventions between 2008 and 2010. Outcomes were analyzed using single and multivariate analysis, Cox regression, and Kaplan-Meier curves. Primary outcomes were primary, primary assisted, and secondary patency rates, as well as limb salvage and survival rates. Over a 2-year period, 480 tibial interventions were completed for 421 patients. Eighty-seven percent (n = 418) of interventions were performed for critical limb ischemia (CLI) and 13% (n = 62) for claudication. The CLI cohort of 418 interventions was analyzed. These patients had a mean age of 71 years with a mean follow-up time of 16 ± 15 months (range, 0-59 months). Of the 418 interventions, 339 underwent percutaneous transluminal angioplasty (PTA): 333 PTA alone, six PTA + stent. The remaining 79 interventions received atherectomy: 33 laser, 13 directional, and 33 orbital either alone or in conjunction with PTA (11 atherectomy only, 68 atherectomy + PTA). The groups did not differ significantly in terms of demographics, risk factors, or technical success. The atherectomy group had more TASC B lesions (54% vs 38%; P = .013), while the PTA-alone group had more TASC D lesions (25% vs 13%; P = .004). TASC A and C lesions did not differ significantly between the groups. No significant differences existed with respect to the early (30-day) outcomes of loss of patency (11% vs 13%; P = .699), complications (8% vs 13%; P = .292), or major amputation (17% vs 13%; P = .344) in the PTA-alone group vs the atherectomy-assisted group. Kaplan-Meier analysis revealed no difference for all primary outcomes of PTA alone vs the atherectomy-assisted group at 12 and 36 months: primary patency (69%, 55% vs 61%, 46%; P = .158), primary assisted patency (83%, 71% vs 85%, 67%; P = .801), secondary patency (94%, 89% vs 95%, 89%; P = .892), limb salvage (79%, 70% vs 81%, 77%; P = .485), or survival (77%, 56% vs 80%, 50%; P = .944). The adjunctive use of atherectomy offered no improvement in primary outcomes over PTA alone in either early or late outcomes in CLI patients who underwent endovascular tibial interventions. Considering the additional cost and increased procedural time, these findings put into question the routine use of adjunctive atherectomy. Copyright © 2013 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
Encyclopedia of software components
NASA Technical Reports Server (NTRS)
Vanwarren, Lloyd (Inventor); Beckman, Brian C. (Inventor)
1991-01-01
Intelligent browsing through a collection of reusable software components is facilitated with a computer having a video monitor and a user input interface, such as a keyboard or a mouse, for transmitting user selections. The browser follows a metaphor in which each encyclopedia volume includes a page listing general topics under that volume's software type and pages listing software components for each of those topics. A picture of encyclopedia volumes with visible labels referring to types of software is presented; the picture is altered to open one of the volumes in response to an initial user selection, displaying on the monitor the page listing its general topics; the picture is then altered to display the page listing software components under one of the general topics in response to a next user selection; and finally a set of different informative plates depicting different types of information about one of the software components is presented in response to a further user selection.
Encyclopedia of Software Components
NASA Technical Reports Server (NTRS)
Warren, Lloyd V. (Inventor); Beckman, Brian C. (Inventor)
1997-01-01
Intelligent browsing through a collection of reusable software components is facilitated with a computer having a video monitor and a user input interface, such as a keyboard or a mouse, for transmitting user selections. The browser follows a metaphor in which each encyclopedia volume includes a page listing general topics under that volume's software type and pages listing software components for each of those topics. A picture of encyclopedia volumes with visible labels referring to types of software is presented; the picture is altered to open one of the volumes in response to an initial user selection, displaying on the monitor the page listing its general topics; the picture is then altered to display the page listing software components under one of the general topics in response to a next user selection; and finally a set of different informative plates depicting different types of information about one of the software components is presented in response to a further user selection.
VIDANA: Data Management System for Nano Satellites
NASA Astrophysics Data System (ADS)
Montenegro, Sergio; Walter, Thomas; Dilger, Erik
2013-08-01
A VIDANA data management system is a network of software and hardware components. This implies a software network, a hardware network, and a smooth connection between the two. Our strategy is based on our innovative middleware: a reliable interconnection network (software and hardware) which can interconnect many unreliable redundant components such as sensors, actuators, communication devices, computers, storage elements, and software components. Component failures are detected, the affected device is disabled, and its function is taken over by a redundant component. Our middleware connects not only software to software, but also software and devices together; software and hardware communicate with each other without having to distinguish which functions are in software and which are implemented in hardware. Components may be turned on and off at any time, and the whole system will autonomously adapt to its new configuration in order to continue fulfilling its task. In VIDANA we aim at dynamic adaptability (run time), static adaptability (tailoring), and unified hardware/software communication protocols. For many of these aspects we take a "learn from nature" approach, as nature offers astonishing reference implementations.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-04
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-769] Certain Handheld Electronic Computing Devices, Related Software, and Components Thereof; Termination of the Investigation Based on... electronic computing devices, related software, and components thereof by reason of infringement of certain...
Code of Federal Regulations, 2014 CFR
2014-04-01
..., parts, firmware, software, and systems. 121.8 Section 121.8 Foreign Relations DEPARTMENT OF STATE...-items, components, accessories, attachments, parts, firmware, software, and systems. (a) An end-item is.... Firmware includes but is not limited to circuits into which software has been programmed. (f) Software...
Code of Federal Regulations, 2012 CFR
2012-04-01
..., parts, firmware, software and systems. 121.8 Section 121.8 Foreign Relations DEPARTMENT OF STATE...-items, components, accessories, attachments, parts, firmware, software and systems. (a) An end-item is.... Firmware includes but is not limited to circuits into which software has been programmed. (f) Software...
Code of Federal Regulations, 2013 CFR
2013-04-01
..., parts, firmware, software and systems. 121.8 Section 121.8 Foreign Relations DEPARTMENT OF STATE...-items, components, accessories, attachments, parts, firmware, software and systems. (a) An end-item is.... Firmware includes but is not limited to circuits into which software has been programmed. (f) Software...
Code of Federal Regulations, 2010 CFR
2010-04-01
..., parts, firmware, software and systems. 121.8 Section 121.8 Foreign Relations DEPARTMENT OF STATE...-items, components, accessories, attachments, parts, firmware, software and systems. (a) An end-item is.... Firmware includes but is not limited to circuits into which software has been programmed. (f) Software...
Code of Federal Regulations, 2011 CFR
2011-04-01
..., parts, firmware, software and systems. 121.8 Section 121.8 Foreign Relations DEPARTMENT OF STATE...-items, components, accessories, attachments, parts, firmware, software and systems. (a) An end-item is.... Firmware includes but is not limited to circuits into which software has been programmed. (f) Software...
Software Reuse Within the Earth Science Community
NASA Technical Reports Server (NTRS)
Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.
2006-01-01
Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. The results are very similar, with the second, larger survey confirming the basic results of the first, smaller survey. The results suggest that reuse of ESE software can drive down the cost and time of system development, increase flexibility and responsiveness of these systems to new technologies and requirements, and increase effective and accountable community participation.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-21
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-810] Certain Navigation Products, Components Thereof, and Related Software; Determination Not To Review an Initial Determination Granting a... United States after importation of certain navigation products, components thereof, and related software...
Teaching Software Componentization: A Bar Chart Java Bean
ERIC Educational Resources Information Center
Mitri, Michel
2010-01-01
In the current object-oriented paradigm, software construction increasingly involves creating and utilizing "software components". These components can serve a variety of functions, from common algorithmic processes to database connectivity to graphical interfaces. The advantage of component architectures is that programmers can use pre-existing…
Reducing Risk in DoD Software-Intensive Systems Development
2016-03-01
intensive systems development risk. This research addresses the use of the Technology Readiness Assessment (TRA) using the nine-level software Technology... The software TRLs are ineffective in reducing technical risk for the software component development. Without the software TRLs, there is no... effective method to perform a software TRA or reduce the technical development risk. The software component will behave as a new, untried technology in nearly
A component-based software environment for visualizing large macromolecular assemblies.
Sanner, Michel F
2005-03-01
The interactive visualization of large biological assemblies poses a number of challenging problems, including the development of multiresolution representations and new interaction methods for navigating and analyzing these complex systems. An additional challenge is the development of flexible software environments that will facilitate the integration and interoperation of computational models and techniques from a wide variety of scientific disciplines. In this paper, we present a component-based software development strategy centered on the high-level, object-oriented, interpretive programming language: Python. We present several software components, discuss their integration, and describe some of their features that are relevant to the visualization of large molecular assemblies. Several examples are given to illustrate the interoperation of these software components and the integration of structural data from a variety of experimental sources. These examples illustrate how combining visual programming with component-based software development facilitates the rapid prototyping of novel visualization tools.
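The component strategy described above lends itself to a compact illustration. The following plain-Python sketch shows independently written components composed into a visualization pipeline at run time; all names here (Component, MoleculeLoader, SurfaceRenderer, Pipeline) are hypothetical stand-ins, not the paper's actual components.

# Hedged sketch of component-based composition in Python.
class Component:
    """Base interface: every component consumes and produces plain Python objects."""
    def process(self, data):
        raise NotImplementedError

class MoleculeLoader(Component):
    def process(self, path):
        # Stand-in for parsing a structure file into a data object.
        return {"source": path, "atoms": [("C", 0.0, 0.0, 0.0)]}

class SurfaceRenderer(Component):
    def process(self, molecule):
        # Stand-in for building a multiresolution surface representation.
        return "surface(%d atoms from %s)" % (len(molecule["atoms"]), molecule["source"])

class Pipeline(Component):
    """Composes independently developed components into one interoperating tool."""
    def __init__(self, *stages):
        self.stages = stages
    def process(self, data):
        for stage in self.stages:
            data = stage.process(data)
        return data

viewer = Pipeline(MoleculeLoader(), SurfaceRenderer())
print(viewer.process("1abc.pdb"))

Because every stage shares one small interface, new components can be prototyped and slotted in without touching the others, which is the rapid-prototyping benefit the paper emphasizes.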
Software Management Environment (SME): Components and algorithms
NASA Technical Reports Server (NTRS)
Hendrick, Robert; Kistler, David; Valett, Jon
1994-01-01
This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented, experience-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'
Acquisition Handbook - Update. Comprehensive Approach to Reusable Defensive Software (CARDS)
1994-03-25
designs, and implementation components (source code, test plans, procedures and results, and system/software documentation). This handbook provides a...activities where software components are acquired, evaluated, tested and sometimes modified. In addition to serving as a facility for the acquisition and...systems from such components [1]. Implementation components are at the lowest level and consist of: specifications; detailed designs; code, test
Cohen, Julien G; Goo, Jin Mo; Yoo, Roh-Eul; Park, Chang Min; Lee, Chang Hyun; van Ginneken, Bram; Chung, Doo Hyun; Kim, Young Tae
2016-12-01
To evaluate the performance of software in segmenting ground-glass and solid components of subsolid nodules in pulmonary adenocarcinomas. Seventy-three pulmonary adenocarcinomas manifesting as subsolid nodules were included. Two radiologists measured the maximal axial diameter of the ground-glass components on lung windows and that of the solid components on lung and mediastinal windows. Nodules were segmented using software by applying five (-850 HU to -650 HU) and nine (-500 HU to -130 HU) attenuation thresholds. We compared the manual and software measurements of ground-glass and solid components with pathology measurements of tumour and invasive components. Segmentation of ground-glass components at a threshold of -750 HU yielded mean differences of +0.06 mm (p = 0.83; 95 % limits of agreement, -4.51 to 4.67) and -2.32 mm (p < 0.001; -8.27 to 3.63) when compared with pathology and manual measurements, respectively. For solid components, mean differences between the software (at -350 HU) and pathology measurements and between the manual (lung and mediastinal windows) and pathology measurements were -0.12 mm (p = 0.74; -5.73 to 5.55), 0.15 mm (p = 0.73; -6.92 to 7.22), and -1.14 mm (p < 0.001; -7.93 to 5.64), respectively. Software segmentation of ground-glass and solid components in subsolid nodules showed no significant difference from pathology. • Software can effectively segment ground-glass and solid components in subsolid nodules. • Software measurements show no significant difference from pathology measurements. • Manual measurements are more accurate on lung windows than on mediastinal windows.
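For readers outside radiology, the thresholding idea itself is simple to demonstrate. The numpy sketch below applies Hounsfield-unit cutoffs to a toy array under one plausible reading of the thresholds quoted above; it illustrates only the general technique, not the commercial segmentation software evaluated in the study.

# Toy illustration of HU-threshold segmentation; the `ct` values are invented.
import numpy as np

ct = np.array([[-900, -760, -740],
               [-600, -400, -340],
               [-300, -100,   40]])  # Hounsfield units for a small ROI

ground_glass = (ct >= -750) & (ct < -350)  # between the GGO and solid cutoffs
solid = ct >= -350                         # at or above the solid cutoff

print("ground-glass voxels:", int(ground_glass.sum()))  # 3
print("solid voxels:", int(solid.sum()))                # 4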
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-16
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-761] Certain Set-Top Boxes, and Hardware and Software Components Thereof; Determination Not To Review Initial Determination Terminating... certain set-top boxes, and hardware and software components thereof by reason of infringement of various...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-22
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-783] Certain GPS Navigation Products, Components Thereof, and Related Software; Termination of Investigation on the Basis of Settlement AGENCY: U.S... GPS navigation products, components thereof, and related software, by reason of the infringement of...
Pybus -- A Python Software Bus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lavrijsen, Wim T.L.P.
2004-10-14
A software bus, just like its hardware equivalent, allows for the discovery, installation, configuration, loading, unloading, and run-time replacement of software components, as well as channeling of inter-component communication. Python, a popular open-source programming language, encourages a modular design on software written in it, but it offers little or no component functionality. However, the language and its interpreter provide sufficient hooks to implement a thin, integral layer of component support. This functionality can be presented to the developer in the form of a module, making it very easy to use. This paper describes a Python module, PyBus, with which the concept of a "software bus" can be realized in Python. It demonstrates, within the context of the ATLAS software framework Athena, how PyBus can be used for the installation and (run-time) configuration of software, not necessarily Python modules, from a Python application in a way that is transparent to the end-user.
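The software-bus concept is easy to make concrete. Below is a minimal, hedged sketch in plain Python of what such a bus provides: run-time loading, unloading, and replacement of components, plus channeled inter-component communication. The Bus class and its methods are invented for illustration and are not the actual PyBus API.

# Minimal illustration of a software bus; not the PyBus interface itself.
import importlib

class Bus:
    def __init__(self):
        self._components = {}   # name -> loaded module or object
        self._subscribers = {}  # channel -> list of callbacks

    def load(self, name, module_path):
        # Discover and install a component by importing it at run time.
        self._components[name] = importlib.import_module(module_path)

    def unload(self, name):
        # Remove a component; a replacement can be loaded under the same name.
        self._components.pop(name, None)

    def subscribe(self, channel, callback):
        self._subscribers.setdefault(channel, []).append(callback)

    def publish(self, channel, message):
        # Channel inter-component communication without direct coupling.
        for callback in self._subscribers.get(channel, []):
            callback(message)

bus = Bus()
bus.load("json_codec", "json")  # any importable module can act as a component
bus.subscribe("events", print)
bus.publish("events", "component loaded")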
Endovascular Management of the Popliteal Artery: Comparison of Atherectomy and Angioplasty
Semaan, Elie; Hamburg, Naomi; Nasr, Wael; Shaw, Palma; Eberhardt, Robert; Woodson, Jonathan; Doros, Gheorghe; Rybin, Denis; Farber, Alik
2013-01-01
Purpose Symptomatic atherosclerotic disease of the popliteal artery presents challenges for endovascular therapy. We evaluated the technical success, complications and midterm outcomes of atherectomy and angioplasty involving the popliteal segment. Methods We conducted a retrospective review of outcomes of popliteal artery intervention using atherectomy or angioplasty performed between 2003 and 2008. Results A total of 56 patients (36% women, age 72.8±12.2 years, 77% critical limb ischemia) underwent popliteal atherectomy (n=18) or angioplasty (n=38). These patients had similar clinical characteristics, TASC/TASC II classification, mean lesion length, and run-off scores. We observed a trend toward higher rates of technical success defined as <30% residual stenosis after atherectomy compared to angioplasty (94% vs. 71%, p=0.08). While angioplasty was associated with a higher frequency of arterial dissection (23% vs. 0%, p=0.003), atherectomy was associated with a higher rate of thromboembolic events (22% vs 0%, p=0.01). Adjunctive stenting was used more frequently following angioplasty compared to atherectomy (45% vs. 6%, p=0.005). Thrombolysis was used to treat embolization in 4 patients in the atherectomy group. The improvement in the ankle-brachial index was similar between the two treatment groups. Primary patency of the popliteal artery at 3, 6, and 12 months was 94%, 88%, and 75% in the atherectomy group and 89%, 82%, and 73% in the angioplasty group (p=NS). There were no significant differences in limb salvage and freedom from reintervention at 1 year between the atherectomy and angioplasty groups. Conclusions Our experience with popliteal artery endovascular therapy indicates a distinct pattern of procedural complications with atherectomy compared to angioplasty but similar midterm patency, limb salvage and freedom from intervention. PMID:19942598
A conceptual model for megaprogramming
NASA Technical Reports Server (NTRS)
Tracz, Will
1990-01-01
Megaprogramming is component-based software engineering and life-cycle management. Megaprogramming and its relationship to other research initiatives (common prototyping system/common prototyping language, domain-specific software architectures, and software understanding) are analyzed. The desirable attributes of megaprogramming software components are identified, and a software development model and resulting prototype megaprogramming system (library interconnection language extended by annotated Ada) are described.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-01
... With Image Processing Systems, Components Thereof, and Associated Software; Notice of Investigation..., and associated software by reason of infringement of certain claims of U.S. Patent Nos. 7,043,087... processing systems, components thereof, and associated software that infringe one or more of claims 1, 6, and...
Support for life-cycle product reuse in NASA's SSE
NASA Technical Reports Server (NTRS)
Shotton, Charles
1989-01-01
The Software Support Environment (SSE) is a software factory for the production of Space Station Freedom Program operational software. The SSE is to be centrally developed and maintained and used to configure software production facilities in the field. The PRC product TTCQF provides for an automated qualification process and analysis of existing code that can be used for software reuse. The interrogation subsystem permits user queries of the reusable data and components which have been identified by an analyzer and qualified with associated metrics. The concept includes reuse of non-code life-cycle components such as requirements and designs. Possible types of reusable life-cycle components include templates, generics, and as-is items. Qualification of reusable elements requires analysis (separation of candidate components into primitives), qualification (evaluation of primitives for reusability according to reusability criteria) and loading (placing qualified elements into appropriate libraries). There can be different qualifications for different installations, methodologies, applications and components. Identifying reusable software and related components is labor-intensive and is best carried out as an integrated function of an SSE.
The software-cycle model for re-engineering and reuse
NASA Technical Reports Server (NTRS)
Bailey, John W.; Basili, Victor R.
1992-01-01
This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.
Component Technology for High-Performance Scientific Simulation Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epperly, T; Kohn, S; Kumfert, G
2000-11-09
We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.
Software Component Technologies and Space Applications
NASA Technical Reports Server (NTRS)
Batory, Don
1995-01-01
In the near future, software systems will be more reconfigurable than hardware. This will be possible through the advent of software component technologies which have been prototyped in universities and research labs. In this paper, we outline the foundations for those technologies and suggest how they might impact software for space applications.
Challenges of the Open Source Component Marketplace in the Industry
NASA Astrophysics Data System (ADS)
Ayala, Claudia; Hauge, Øyvind; Conradi, Reidar; Franch, Xavier; Li, Jingyue; Velle, Ketil Sandanger
The reuse of Open Source Software components available on the Internet is playing a major role in the development of Component Based Software Systems. Nevertheless, the special nature of the OSS marketplace has taken the “classical” concept of software reuse based on centralized repositories to a completely different arena based on massive reuse over Internet. In this paper we provide an overview of the actual state of the OSS marketplace, and report preliminary findings about how companies interact with this marketplace to reuse OSS components. Such data was gathered from interviews in software companies in Spain and Norway. Based on these results we identify some challenges aimed to improve the industrial reuse of OSS components.
ResidPlots-2: Computer Software for IRT Graphical Residual Analyses
ERIC Educational Resources Information Center
Liang, Tie; Han, Kyung T.; Hambleton, Ronald K.
2009-01-01
This article discusses ResidPlots-2, computer software that provides a powerful tool for IRT graphical residual analyses. ResidPlots-2 consists of two components: a component for computing residual statistics and another component for communicating with users and for plotting the residual graphs. The features of the ResidPlots-2 software are…
Lo, Ming; Hue, Chih-Wei
2008-11-01
The Character-Component Analysis Toolkit (C-CAT) software was designed to assist researchers in constructing experimental materials using traditional Chinese characters. The software package contains two sets of character stocks: one suitable for research using literate adults as subjects and one suitable for research using schoolchildren as subjects. The software can identify linguistic properties, such as the number of strokes contained, the character-component pronunciation regularity, and the arrangement of character components within a character. Moreover, it can compute a character's linguistic frequency, neighborhood size, and phonetic validity with respect to a user-selected character stock. It can also search the selected character stock for similar characters or for character components with user-specified linguistic properties.
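To make the stock-based statistics concrete, here is a hedged Python sketch of a neighborhood-size query of the kind described. The toy stock, its component decompositions, and the frequency counts are all fabricated; the real C-CAT data and linguistic definitions differ.

# Invented mini character stock: character -> components and corpus frequency.
stock = {
    "好": {"components": ["女", "子"], "freq": 1273},
    "媽": {"components": ["女", "馬"], "freq": 845},
    "駕": {"components": ["加", "馬"], "freq": 112},
}

def neighborhood(char):
    # Characters in the stock sharing at least one component with `char`.
    parts = set(stock[char]["components"])
    return [c for c in stock
            if c != char and parts & set(stock[c]["components"])]

print(neighborhood("媽"), "size:", len(neighborhood("媽")))  # ['好', '駕'] size: 2
print("frequency of 媽:", stock["媽"]["freq"])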
NASA Technical Reports Server (NTRS)
Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.
2000-01-01
The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology using a Commercial-Off-The-Shelf (COTS)-based object oriented component approach to open inter-operable software development and software reuse.
NASA Technical Reports Server (NTRS)
Sundermier, Amy (Inventor)
2002-01-01
A method for acquiring and assembling software components at execution time into a client program, where the components may be acquired from remote networked servers is disclosed. The acquired components are assembled according to knowledge represented within one or more acquired mediating components. A mediating component implements knowledge of an object model. A mediating component uses its implemented object model knowledge, acquired component class information and polymorphism to assemble components into an interacting program at execution time. The interactions or abstract relationships between components in the object model may be implemented by the mediating component as direct invocations or indirect events or software bus exchanges. The acquired components may establish communications with remote servers. The acquired components may also present a user interface representing data to be exchanged with the remote servers. The mediating components may be assembled into layers, allowing arbitrarily complex programs to be constructed at execution time.
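A minimal sketch of the mediating-component idea follows, using Python in place of whatever language the patented method assumes. The mediator's object-model knowledge is reduced here to a wiring table mapping roles to classes that are imported and instantiated at execution time; the wiring shown uses only standard-library classes so the sketch runs as-is.

# Hedged illustration of a mediating component; names are hypothetical.
import importlib

class Mediator:
    """Holds object-model knowledge: which roles exist and how to realize them."""
    def __init__(self, wiring):
        self.wiring = wiring  # role -> (module_name, class_name)

    def assemble(self):
        parts = {}
        for role, (module_name, class_name) in self.wiring.items():
            module = importlib.import_module(module_name)  # acquired at run time
            parts[role] = getattr(module, class_name)()    # polymorphic creation
        return parts

mediator = Mediator({"buffer": ("io", "StringIO"),
                     "parser": ("html.parser", "HTMLParser")})
program = mediator.assemble()
program["buffer"].write("assembled at execution time")
print(program["buffer"].getvalue())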
AdaNET Dynamic Software Inventory (DSI) prototype component acquisition plan
NASA Technical Reports Server (NTRS)
Hanley, Lionel
1989-01-01
A component acquisition plan contains the information needed to evaluate, select, and acquire software and hardware components necessary for successful completion of the AdaNET Dynamic Software Inventory (DSI) Management System Prototype. This plan will evolve and be applicable to all phases of the DSI prototype development. Resources, budgets, schedules, and organizations related to component acquisition activities are provided. The purpose and description of each software or hardware component to be acquired are presented; since this is a plan for acquisition of all components, this section is not applicable. The procurement activities and events conducted by the acquirer are described, identifying who is responsible, where each activity will be performed, and when the activities will occur for each planned procurement. Acquisition requirements describe the specific requirements and standards to be followed during component acquisition. The activities which will take place during component acquisition are described. A list of abbreviations and acronyms and a glossary are included.
Managing Scientific Software Complexity with Bocca and CCA
Allan, Benjamin A.; Norris, Boyana; Elwasif, Wael R.; ...
2008-01-01
In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.
Component Prioritization Schema for Achieving Maximum Time and Cost Benefits from Software Testing
NASA Astrophysics Data System (ADS)
Srivastava, Praveen Ranjan; Pareek, Deepak
Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Deciding when software testing ends is a crucial feature of any software development project. A premature release will involve risks like undetected bugs, the cost of fixing faults later, and discontented customers. Any software organization would want to achieve the maximum possible benefits from software testing with minimum resources. Testing time and cost need to be optimized for achieving a competitive edge in the market. In this paper, we propose a schema, called the Component Prioritization Schema (CPS), to achieve an effective and uniform prioritization of the software components. This schema serves as an extension to the Non-Homogeneous Poisson Process based Cumulative Priority Model. We also introduce an approach for handling time-intensive versus cost-intensive projects.
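As a rough illustration of the reliability mathematics involved, the sketch below ranks components by expected undetected faults under the Goel-Okumoto mean-value function m(t) = a(1 - e^(-bt)), a standard NHPP form; the paper's Cumulative Priority Model may differ in detail, and all parameter values are invented.

# Prioritize components by expected faults still undetected after t units of testing.
import math

def expected_remaining_faults(a, b, t):
    # Under m(t) = a * (1 - exp(-b*t)), remaining faults = a * exp(-b*t).
    return a * math.exp(-b * t)

components = {  # name: (total expected faults a, detection rate b, time tested t)
    "parser":    (120, 0.05, 10),
    "scheduler": (60,  0.20, 10),
    "ui":        (40,  0.10, 10),
}

priority = sorted(components,
                  key=lambda c: expected_remaining_faults(*components[c]),
                  reverse=True)
print(priority)  # ['parser', 'ui', 'scheduler'] for these toy parameters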
15 CFR Supplement No. 6 to Part 742 - Technical Questionnaire for Encryption Items
Code of Federal Regulations, 2012 CFR
2012-01-01
... software, provide the following information: (1) Description of all the symmetric and asymmetric encryption... third-party hardware or software encryption components (if any). Identify the manufacturers of the hardware or software components, including specific part numbers and version information as needed to...
15 CFR Supplement No. 6 to Part 742 - Technical Questionnaire for Encryption Items
Code of Federal Regulations, 2013 CFR
2013-01-01
... software, provide the following information: (1) Description of all the symmetric and asymmetric encryption... third-party hardware or software encryption components (if any). Identify the manufacturers of the hardware or software components, including specific part numbers and version information as needed to...
15 CFR Supplement No. 6 to Part 742 - Technical Questionnaire for Encryption Items
Code of Federal Regulations, 2014 CFR
2014-01-01
... software, provide the following information: (1) Description of all the symmetric and asymmetric encryption... third-party hardware or software encryption components (if any). Identify the manufacturers of the hardware or software components, including specific part numbers and version information as needed to...
System Testing of Ground Cooling System Components
NASA Technical Reports Server (NTRS)
Ensey, Tyler Steven
2014-01-01
This internship focused primarily upon software unit testing of Ground Cooling System (GCS) components, one of the three types of tests (unit, integrated, and COTS/regression) utilized in software verification. Unit tests are used to test the software of necessary components before it is implemented into the hardware. A unit test exercises the control data, usage procedures, and operating procedures of a particular component to determine whether the program is fit for use. Three different files are used to build and run an efficient unit test: the Model Test file (.mdl), the Simulink SystemTest file (.test), and the autotest file (.m). The Model Test file includes the component that is being tested with the appropriate Discrete Physical Interface (DPI) for testing. The Simulink SystemTest is a program used to test all of the requirements of the component. The autotest verifies that the component passes Model Advisor and system testing, and puts the results into the proper files. Once unit testing is completed on the GCS components, they can be implemented into the GCS schematic, and the software of the GCS model as a whole can be tested using integrated testing. Unit testing is a critical part of software verification; it allows for the testing of more basic components before a model of higher fidelity is tested, making the process of testing flow in an orderly manner.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-09
..., Associated Software, and Products Containing the Same AGENCY: U.S. International Trade Commission. ACTION..., components thereof, associated software, and products containing the same by reason of infringement of..., components thereof, associated software, and products containing the same that infringe one or more of claims...
Achieving Better Buying Power through Acquisition of Open Architecture Software Systems: Volume 1
2016-01-06
supporting “Bring Your Own Devices” (BYOD)? New business models for OA software components: franchising, enterprise licensing, metered usage... IP and cybersecurity requirements will need continuous attention!
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-06
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-795] Certain Video Analytics Software... filed by ObjectVideo, Inc. of Reston, Virginia. 76 FR 45859 (Aug. 1, 2011). The complaint, as amended... certain video analytics software, systems, components thereof, and products containing same by reason of...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-21
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-852] Certain Video Analytics Software..., 2012, based on a complaint filed by ObjectVideo, Inc. (``ObjectVideo'') of Reston, Virginia. 77 FR... United States after importation of certain video analytics software systems, components thereof, and...
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.
Rhine, W R; Spaner, S D
1983-11-01
Following Anastasi and Thurstone, the factor structure of evaluative anxiety was examined among six groups of primary age boys and girls (N = 8064). A factor matching technique was used to study hypotheses about the effects of group differences in socioeconomic status (SES), ethnicity, and sex on the pattern of the children's responses to the Test Anxiety Scale for Children (TASC). Hypotheses about the congruence of factor patterns were based on both demographic differences and results of developmental research. The hypothesis of an SES X ethnicity X sex interaction was strongly supported. Implications for comparing factor structures, measuring evaluative anxiety, and future research of evaluative anxiety are discussed.
DigiSeis—A software component for digitizing seismic signals using the PC sound card
NASA Astrophysics Data System (ADS)
Amin Khan, Khalid; Akhter, Gulraiz; Ahmad, Zulfiqar
2012-06-01
An innovative software-based approach to developing an inexpensive experimental seismic recorder is presented. This approach requires no extra hardware, as the built-in PC sound card is used for digitization of seismic signals. DigiSeis, an ActiveX component, is developed to capture the digitized seismic signals from the sound card and deliver them to applications for processing and display. A seismic recorder application, SeisWave, is developed on top of this component; it provides real-time monitoring and display of seismic events picked up by a pair of external geophones. This recorder can be used as an educational aid for conducting seismic experiments. It can also be connected with suitable seismic sensors to record earthquakes. The software application and the ActiveX component are available for download. This component can be used to develop seismic recording applications according to user-specific requirements.
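A present-day analogue of the DigiSeis capture step can be sketched in a few lines of Python with the third-party sounddevice package (pip install sounddevice); this is a stand-in for illustration, not the original ActiveX component, and it assumes a pair of geophones wired to the sound card's stereo line-in.

# Capture a short window of two-channel "seismic" data from the sound card.
import sounddevice as sd

FS = 44100       # sound-card sampling rate in Hz
SECONDS = 2      # length of one acquisition window

samples = sd.rec(int(FS * SECONDS), samplerate=FS, channels=2)
sd.wait()        # block until the recording is complete

print("captured", samples.shape[0], "samples per channel")
print("peak amplitude:", float(abs(samples).max()))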
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaenko, Alexander; Windus, Theresa L.; Sosonkina, Masha
2012-10-19
The design and development of scientific software components to provide an interface to the effective fragment potential (EFP) methods are reported. Multiscale modeling of physical and chemical phenomena demands the merging of software packages developed by research groups in significantly different fields. Componentization offers an efficient way to realize new high performance scientific methods by combining the best models available in different software packages without a need for package readaptation after the initial componentization is complete. The EFP method is an efficient electronic structure theory based model potential that is suitable for predictive modeling of intermolecular interactions in large molecular systems, such as liquids, proteins, atmospheric aerosols, and nanoparticles, with an accuracy that is comparable to that of correlated ab initio methods. The developed components make the EFP functionality accessible for any scientific component-aware software package. The performance of the component is demonstrated on a protein interaction model, and its accuracy is compared with results obtained with coupled cluster methods.
Building Your Own Web Course: The Case for Off-the-Shelf Component Software.
ERIC Educational Resources Information Center
Kaplan, Howard
1998-01-01
Compares the features, advantages, and disadvantages of two major software options available for designing web courses: (1) component, off-the shelf software that allows for creation of audio slide lectures, course materials, discussion forums, animations, synchronous chat groups, quiz creators, and electronic mail, and (2) integrated packages…
Neural network-based retrieval from software reuse repositories
NASA Technical Reports Server (NTRS)
Eichmann, David A.; Srinivas, Kankanahalli
1992-01-01
A significant hurdle confronts the software reuser attempting to select candidate components from a software repository - discriminating between those components without resorting to inspection of the implementation(s). We outline an approach to this problem based upon neural networks which avoids requiring the repository administrators to define a conceptual closeness graph for the classification vocabulary.
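The retrieval idea can be illustrated independently of any particular network architecture. The sketch below scores repository components against a query by similarity in a shared vector space, which is the service a trained network would supply without an administrator-defined closeness graph; plain bag-of-words counts keep this stand-in self-contained, and the repository entries are invented.

# Simplified stand-in for learned similarity-based component retrieval.
from collections import Counter
import math

repository = {
    "qsort_mod": "sort array ascending comparison",
    "btree_pkg": "balanced tree insert search delete",
    "heap_lib":  "priority queue heap insert extract",
}

def vec(text):
    return Counter(text.split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

query = vec("search a balanced tree")
ranked = sorted(repository,
                key=lambda name: cosine(query, vec(repository[name])),
                reverse=True)
print(ranked)  # 'btree_pkg' ranks first for this query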
Azar, Yara; DeRubertis, Brian; Baril, Donald; Woo, Karen
2018-05-01
Atherectomy has become an increasingly utilized modality for the endovascular treatment of peripheral arterial occlusive disease. The objective of this study was to determine the incidence and risk factors for atherectomy-associated complications. A retrospective review was performed for all atherectomy procedures performed between January 2011 and December 2015 in the Southern California Vascular Outcomes Improvement Collaborative. Atherectomy was defined as laser, orbital, or excisional atherectomy. Complications were dissection, perforation, and distal embolization. Seven hundred twenty-nine atherectomy procedures were performed at 7 institutions by 27 practitioners. The mean age was 73 years with 415 (57%) males. Four hundred nineteen (57%) were diabetic, 673 (92%) hypertensive, 457 (63%) smokers, and 244 (34%) had coronary artery disease. Dissection occurred in 51 (7%) procedures, embolization in 23 (3.1%), and perforation in 12 (1.6%). The mean number of lesions treated per artery was the same at 1.6 in patients with any complication and no complication (P = 0.77). The total occluded length was 7.4 cm for complications versus 7.2 cm for no complication (P = 0.73). The total treated length was 12.9 cm for complications versus 11.3 cm for no complication (P = 0.03). The incidence of complications for Trans-Atlantic Inter-Society Consensus (TASC) C/D lesions were 13% compared to 10% for TASC A/B lesions (P = 0.05). The incidence of complications in superficial femoral/popliteal lesions was 12.9% vs. 10.4% in tibial lesions (P = 0.13). In multivariable analysis, treatment length was associated with a small increased risk of complication (odds ratio = 1.02, 95% confidence interval = 1.0-1.04). Increased treatment length is associated with an increased risk of atherectomy-associated complications. Demographic factors and comorbidities were not predictors of complications. Copyright © 2017 Elsevier Inc. All rights reserved.
Kavaliauskienė, Zana; Benetis, Rimantas; Inčiūra, Donatas; Aleksynas, Nerijus; Kaupas, Rytis Stasys; Antuševas, Aleksandras
2014-01-01
The purpose of our study was to evaluate 1- and 2-year results and the influence of risk factors on the outcome in patients undergoing iliac artery stenting for TASC II type B, C, and D iliac lesions. In this prospective nonrandomized study conducted between April 15, 2011, and April 15, 2013, 316 patients underwent angiography with a diagnosis of aortoiliac atherosclerotic disease. Of these, 62 iliac endovascular procedures (87 stents) were performed in 54 patients. The indications for revascularization were disabling claudication (Rutherford 2, 5.9%; Rutherford 3, 35.2%), rest pain (Rutherford 4, 22.2%), and gangrene (Rutherford 5, 16.7%). The overall complication rate was 9.2%. The cumulative primary stent patency at 1 and 2 years was 83.0%±5.2% and 79.9%±5.8%, respectively. Early stent thrombosis in ≤30 days was detected in two patients (3.7%). The primary patency rates for the stents ≤61 mm at 12 and 24 months were 90.6%±4.5% and 86.6%±5.8%, respectively; those for the stents >61 mm were 67.7%±10.9% and 60.2%±12.0%, respectively (P=0.016). The multivariate Cox regression analysis identified the localization of a stent in both the CIA and the EIA (hazard ratio [HR], 3.3; 95% confidence interval [CI], 1.1-9.5; P=0.021) and poor runoff (HR, 3.2; 95% CI, 1.0-10.0; P=0.047) as independent predictors of decreased primary stent patency. The localization of a stent in both iliac (CIA and EIA) arteries and poor runoff significantly reduce the primary stent patency. Patients with stents >61 mm have a higher risk of stent thrombosis or in-stent restenosis. Copyright © 2014 Lithuanian University of Health Sciences. Production and hosting by Elsevier Urban & Partner Sp. z o.o. All rights reserved.
NASA Technical Reports Server (NTRS)
Lo, P. S.; Card, D.
1983-01-01
The Software Engineering Laboratory (SEL) Data Base Maintenance System (DBAM) is explained. The various software facilities of the SEL, DBAM operating procedures, and DBAM system information are described. The relationships among DBAM components (baseline diagrams), component descriptions, overlay descriptions, indirect command file listings, file definitions, and sample data collection forms are provided.
Application of Design Patterns in Refactoring Software Design
NASA Technical Reports Server (NTRS)
Baggs, Rhoda; Shaykhian, Gholam Ali
2007-01-01
Refactoring software design is a method of changing a software design while explicitly preserving its unique design functionalities. The presented approach utilizes design patterns as the basis for refactoring software design. A comparison of design solutions is made through C++ programming language examples to illustrate this approach. Developing reusable components is also discussed; the paper shows that the construction of such components can diminish the added burden of both refactoring and the use of design patterns.
Apply Design Patterns to Refactor Software Design
NASA Technical Reports Server (NTRS)
Baggs, Rhoda; Shaykhian, Gholam Ali
2007-01-01
Refactoring software design is a method of changing a software design while explicitly preserving its unique design functionalities. The presented approach utilizes design patterns as the basis for refactoring software design. A comparison of design solutions is made through C++ programming language examples to illustrate this approach. Developing reusable components is also discussed; the paper shows that the construction of such components can diminish the added burden of both refactoring and the use of design patterns.
RHCV Telescope System Operations Manual
2018-01-05
hardware and software components. Several of the components are closely coupled and rely on one another, while others are largely independent. This... attendant training. The use cases are briefly described in separate sections, and step-by-step instructions are presented. Each section begins on a new
GERICOS: A Generic Framework for the Development of On-Board Software
NASA Astrophysics Data System (ADS)
Plasson, P.; Cuomo, C.; Gabriel, G.; Gauthier, N.; Gueguen, L.; Malac-Allain, L.
2016-08-01
This paper presents an overview of the GERICOS framework (GEneRIC Onboard Software), its architecture, its various layers and its future evolutions. The GERICOS framework, developed and qualified by LESIA, offers a set of generic, reusable and customizable software components for the rapid development of payload flight software. The GERICOS framework has a layered structure. The first layer (GERICOS::CORE) implements the concept of active objects and forms an abstraction layer over the top of real-time kernels. The second layer (GERICOS::BLOCKS) offers a set of reusable software components for building flight software based on generic solutions to recurrent functionalities. The third layer (GERICOS::DRIVERS) implements software drivers for several COTS IP cores of the LEON processor ecosystem.
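The active-object concept underlying GERICOS::CORE can be sketched briefly. The Python below illustrates only the pattern, not the framework's actual implementation for the LEON processor ecosystem: each component owns a message queue and a thread, so a sender enqueues work and returns immediately instead of blocking inside another component.

# Hedged illustration of the active-object pattern.
import queue
import threading

class ActiveObject:
    def __init__(self, name):
        self.name = name
        self._inbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, message):
        # Asynchronous interface: enqueue and return immediately.
        self._inbox.put(message)

    def _run(self):
        while True:
            message = self._inbox.get()
            if message is None:  # sentinel requesting shutdown
                break
            print(self.name, "handling:", message)

    def stop(self):
        self._inbox.put(None)
        self._thread.join()

housekeeper = ActiveObject("housekeeping")
housekeeper.send("emit telemetry packet")
housekeeper.stop()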
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-14
... Graphics Data Processing Systems, Components Thereof, and Associated Software; Institution of Investigation... associated software by reason of infringement of certain claims of U.S. Patent No. 5,945,997 (``the `997... software that infringe one or more of claims 1, 3-5, 9, and 16 of the `997 patent; claims 1, 5, and 9 of...
Hybrid Modeling for Testing Intelligent Software for Lunar-Mars Closed Life Support
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Nicholson, Leonard S. (Technical Monitor)
1999-01-01
Intelligent software is being developed for closed life support systems with biological components, for human exploration of the Moon and Mars. The intelligent software functions include planning/scheduling, reactive discrete control and sequencing, management of continuous control, and fault detection, diagnosis, and management of failures and errors. Four types of modeling information have been essential to system modeling and simulation to develop and test the software and to provide operational model-based what-if analyses: discrete component operational and failure modes; continuous dynamic performance within component modes, modeled qualitatively or quantitatively; configuration of flows and power among components in the system; and operations activities and scenarios. CONFIG, a multi-purpose discrete event simulation tool that integrates all four types of models for use throughout the engineering and operations life cycle, has been used to model components and systems involved in the production and transfer of oxygen and carbon dioxide in a plant-growth chamber and between that chamber and a habitation chamber with physicochemical systems for gas processing.
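The first of the four model types, discrete operational and failure modes, is straightforward to illustrate. The toy state machine below (a hypothetical oxygen valve, not a CONFIG model) shows how simulation events drive transitions among nominal and failure modes.

# Invented example of discrete operational and failure modes.
class OxygenValve:
    TRANSITIONS = {
        ("closed", "open_cmd"): "open",
        ("open", "close_cmd"): "closed",
        ("open", "seal_failure"): "stuck_open",  # failure mode
    }

    def __init__(self):
        self.mode = "closed"

    def event(self, name):
        # Unknown or inapplicable events leave the mode unchanged.
        self.mode = self.TRANSITIONS.get((self.mode, name), self.mode)
        return self.mode

valve = OxygenValve()
for ev in ["open_cmd", "seal_failure", "close_cmd"]:
    print(ev, "->", valve.event(ev))  # close_cmd has no effect once stuck_open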
Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...
2008-01-01
Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.
NA-42 TI Shared Software Component Library FY2011 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.
The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts with the long term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy to use web based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed via two different geographic locations and continues to be used. The knowledge gained from the collaboration and hosting of this repository in conjunction with PNNL software development and systems engineering capabilities were used in the selection of a package to be used in the implementation of the software component library on behalf of NA-42 TI. The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI in documenting their development activities. When sufficiently completed, the questionnaire illustrates that the software development activities recorded incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams and revised versions distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID AMS specific questionnaire are being used as the initial content to be established in the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and the approach taken for installation of the software needed to host the component library. Additionally, it defines the process by which users request access for the contribution and retrieval of library content.
NASA Technical Reports Server (NTRS)
Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.
1987-01-01
The principal focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large of the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission- and safety-critical requirements at run time. This focus is extremely important because of the contribution of the scaling direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifecycle issues of three environments: host, integration, and target.
Software development environments: Status and trends
NASA Technical Reports Server (NTRS)
Duffel, Larry E.
1988-01-01
Currently, software engineers are the essential integrating factor tying several components together. The components consist of process, methods, computers, tools, support environments, and software engineers. Today the engineers empower the tools, rather than the tools empowering the engineers. Some of the issues in software engineering are quality, managing the software engineering process, and productivity. A strategy to address these issues is to promote the evolution of software engineering from an ad hoc, labor-intensive activity to a managed, technology-supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment, and educating the personnel.
Component Models for Semantic Web Languages
NASA Astrophysics Data System (ADS)
Henriksson, Jakob; Aßmann, Uwe
Intelligent applications and agents on the Semantic Web typically need to be specified with, or interact with specifications written in, many different kinds of formal languages. Such languages include ontology languages, data and metadata query languages, as well as transformation languages. As learnt from years of experience in development of complex software systems, languages need to support some form of component-based development. Components enable higher software quality, better understanding and reusability of already developed artifacts. Any component approach contains an underlying component model, a description detailing what valid components are and how components can interact. With the multitude of languages developed for the Semantic Web, what are their underlying component models? Do we need to develop one for each language, or is a more general and reusable approach achievable? We present a language-driven component model specification approach. This means that a component model can be (automatically) generated from a given base language (actually, its specification, e.g. its grammar). As a consequence, we can provide components for different languages and simplify the development of software artifacts used on the Semantic Web.
Integrated Software Health Management for Aircraft GN and C
NASA Technical Reports Server (NTRS)
Schumann, Johann; Mengshoel, Ole
2011-01-01
Modern aircraft rely heavily on dependable operation of many safety-critical software components. Despite careful design, verification and validation (V&V), on-board software can fail with disastrous consequences if it encounters problematic software/hardware interaction or must operate in an unexpected environment. We are using a Bayesian approach to monitor the software and its behavior during operation and provide up-to-date information about the health of the software and its components. The powerful reasoning mechanism provided by our model-based Bayesian approach makes reliable diagnosis of the root causes possible and minimizes the number of false alarms. Compilation of the Bayesian model into compact arithmetic circuits makes software health management (SWHM) feasible even on platforms with limited CPU power. We show initial results of SWHM on a small simulator of an embedded aircraft software system, where software and sensor faults can be injected.
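As a rough illustration of the Bayesian diagnosis idea in this abstract, the Python sketch below updates the probabilities of three hypothetical health states after an out-of-range observation. The state names, priors, and likelihoods are invented for the example and are not taken from the paper's model.

# Minimal sketch (not the authors' model): two-alternative Bayesian
# diagnosis of an out-of-range signal, in the spirit of model-based SWHM.
# All priors and likelihoods below are illustrative assumptions.
priors = {"healthy": 0.98, "sensor_fault": 0.015, "software_fault": 0.005}
# P(out-of-range observation | hypothesis) -- assumed values
likelihood = {"healthy": 0.01, "sensor_fault": 0.7, "software_fault": 0.6}

def posterior(observation_seen: bool) -> dict:
    """Update hypothesis probabilities after one observation."""
    unnorm = {h: priors[h] * (likelihood[h] if observation_seen else 1 - likelihood[h])
              for h in priors}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

print(posterior(True))  # the sensor fault becomes the most likely root cause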
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-10
... Certain GPS Navigation Products, Components Thereof, and Related Software, DN 2814; the Commission is... importation of certain GPS navigation products, components thereof, and related software. The complaint names...
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.
1993-01-01
Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault frequency components so that testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high risk system components. We present experimental results obtained by classifying Ada components into two classes: is or is not likely to generate faults during system and acceptance test. Also, we evaluate the accuracy of the model and the insights it provides into the error making process.
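The core idea of the preceding abstract, predicting a two-class fault-proneness label from software metrics, can be illustrated with a stand-in model. The sketch below uses a plain logistic classifier rather than Optimized Set Reduction, and the metric values and labels are fabricated for demonstration.

import numpy as np

# Stand-in classifier (not Optimized Set Reduction): logistic regression
# on two invented metrics, LOC and cyclomatic complexity.
X = np.array([[120, 4], [900, 25], [300, 8], [1500, 40], [80, 2], [700, 30]], float)
y = np.array([0, 1, 0, 1, 0, 1], float)  # 1 = faults found in system/acceptance test

mu, sd = X.mean(0), X.std(0)
Xs = (X - mu) / sd                        # standardize the metrics
w, b = np.zeros(2), 0.0
for _ in range(2000):                     # batch gradient descent
    p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))
    w -= 0.1 * Xs.T @ (p - y) / len(y)
    b -= 0.1 * (p - y).mean()

candidate = (np.array([500.0, 20.0]) - mu) / sd
risk = 1.0 / (1.0 + np.exp(-(candidate @ w + b)))
print(f"P(fault-prone) = {risk:.2f}")     # focus test effort if this is high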
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from solving a problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software are discussed.
NASA Technical Reports Server (NTRS)
Hanley, Lionel
1989-01-01
The Ada Software Repository is a public-domain collection of Ada software and information. The Ada Software Repository is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories. These directories are organized by topic, and their names and a brief overview of their topics are contained. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.
Component-specific modeling. [jet engine hot section components
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.
1992-01-01
Accomplishments are described for a 3 year program to develop methodology for component-specific modeling of aircraft hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models, (2) geometry model generators, (3) remeshing, (4) specialty three-dimensional inelastic structural analysis, (5) computationally efficient solvers, (6) adaptive solution strategies, (7) engine performance parameters/component response variables decomposition and synthesis, (8) integrated software architecture and development, and (9) validation cases for software developed.
78 FR 1162 - Cardiovascular Devices; Reclassification of External Cardiac Compressor
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-08
... safety and electromagnetic compatibility; For devices containing software, software verification... electromagnetic compatibility; For devices containing software, software verification, validation, and hazard... electrical components, appropriate analysis and testing must validate electrical safety and electromagnetic...
77 FR 40082 - Certain Gaming and Entertainment Consoles, Related Software, and Components Thereof...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-06
... Commission remands for the ALJ to (1) apply the Commission's opinion in Certain Electronic Devices With Image Processing Systems, Components Thereof, and Associated Software, Inv. No. 337-TA-724, Comm'n Op. (Dec. 21...
SEPAC flight software detailed design specifications, volume 1
NASA Technical Reports Server (NTRS)
1982-01-01
The detailed design specifications (as built) for the SEPAC Flight Software are defined. The design includes a description of the total software system and of each individual module within the system. The design specifications describe the decomposition of the software system into its major components. The system structure is expressed in the following forms: the control-flow hierarchy of the system, the data-flow structure of the system, the task hierarchy, the memory structure, and the software-to-hardware configuration mapping. The component design description includes details on the following elements: register conventions, module (subroutine) invocation, module functions, interrupt servicing, data definitions, and database structure.
Fault Tree Analysis Application for Safety and Reliability
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies root causes in system components, but when software is identified as a root cause, the tools do not build trees into the software component. No commercial software tools have been built specifically for the development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
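To make the FTA mechanics concrete, here is a minimal Python sketch of fault-tree evaluation under an independence assumption. The gate structure and basic-event probabilities are hypothetical, not taken from the tools discussed above.

# Hedged sketch: evaluating a small software fault tree, assuming
# independent basic events. Gate layout and probabilities are invented.
def prob(node):
    """Return the failure probability of a fault-tree node."""
    kind, payload = node
    if kind == "basic":
        return payload                 # leaf: its probability
    ps = [prob(child) for child in payload]
    if kind == "AND":                  # all inputs must fail
        out = 1.0
        for p in ps:
            out *= p
        return out
    if kind == "OR":                   # any failing input is enough
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError(kind)

# TOP = OR( AND(bad_input, missing_guard), unhandled_exception )
tree = ("OR", [("AND", [("basic", 1e-3), ("basic", 1e-2)]),
               ("basic", 1e-4)])
print(f"P(top event) = {prob(tree):.2e}")   # about 1.1e-4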
Rosenthal, L E
1986-10-01
Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.
Quadratic Blind Linear Unmixing: A Graphical User Interface for Tissue Characterization
Gutierrez-Navarro, O.; Campos-Delgado, D.U.; Arce-Santana, E. R.; Jo, Javier A.
2016-01-01
Spectral unmixing is the process of breaking down data from a sample into its basic components and their abundances. Previous work has been focused on blind unmixing of multi-spectral fluorescence lifetime imaging microscopy (m-FLIM) datasets under a linear mixture model and quadratic approximations. This method provides a fast linear decomposition and can work without a limitation in the maximum number of components or end-members. Hence this work presents an interactive software which implements our blind end-member and abundance extraction (BEAE) and quadratic blind linear unmixing (QBLU) algorithms in Matlab. The options and capabilities of our proposed software are described in detail. When the number of components is known, our software can estimate the constitutive end-members and their abundances. When no prior knowledge is available, the software can provide a completely blind solution to estimate the number of components, the end-members and their abundances. The characterization of three case studies validates the performance of the new software: ex-vivo human coronary arteries, human breast cancer cell samples, and in-vivo hamster oral mucosa. The software is freely available in a hosted webpage by one of the developing institutions, and allows the user a quick, easy-to-use and efficient tool for multi/hyper-spectral data decomposition. PMID:26589467
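A minimal sketch of the linear-mixture step that underlies such unmixing tools: with known end-members, nonnegative abundances can be recovered by nonnegative least squares. The synthetic spectra below are assumptions for illustration, not m-FLIM data, and the BEAE/QBLU algorithms themselves are not reproduced.

import numpy as np
from scipy.optimize import nnls

# Known end-member spectra as columns of E; recover abundances a with
# E @ a ≈ measurement. The data here is synthetic, not from the paper.
rng = np.random.default_rng(0)
E = np.abs(rng.normal(size=(50, 3)))          # 50 spectral bins, 3 end-members
a_true = np.array([0.6, 0.3, 0.1])
m = E @ a_true + 0.01 * rng.normal(size=50)   # noisy mixed measurement

a_hat, residual = nnls(E, m)                  # nonnegative least squares
a_hat /= a_hat.sum()                          # normalize to fractional abundances
print(np.round(a_hat, 2))                     # close to [0.6, 0.3, 0.1]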
The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.
Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin
2007-11-01
This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process philosophically similar to agile methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining a users' and developers' mailing list, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.
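The state-machine discipline described above can be illustrated with a toy component. The sketch below is not IGSTK's C++ API, just the transition-table pattern in which invalid requests are rejected rather than corrupting state; all names are invented.

# Illustrative only: a tracker tool governed by an explicit transition
# table, so the component can never reach an invalid state.
class TrackerTool:
    TRANSITIONS = {
        ("idle", "initialize"): "ready",
        ("ready", "start_tracking"): "tracking",
        ("tracking", "stop_tracking"): "ready",
        ("ready", "shutdown"): "idle",
    }

    def __init__(self):
        self.state = "idle"

    def request(self, event: str) -> None:
        nxt = self.TRANSITIONS.get((self.state, event))
        if nxt is None:   # invalid request: reject instead of corrupting state
            print(f"rejected: {event!r} in state {self.state!r}")
            return
        self.state = nxt

tool = TrackerTool()
tool.request("start_tracking")   # rejected: the tool is not initialized yet
tool.request("initialize")
tool.request("start_tracking")
print(tool.state)                # tracking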
Documenting Models for Interoperability and Reusability ...
Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration between scientific communities, since component-based modeling can integrate models from different disciplines. Integrated Environmental Modeling (IEM) systems focus on transferring information between components by capturing a conceptual site model; establishing local metadata standards for input/output of models and databases; managing data flow between models and throughout the system; facilitating quality control of data exchanges (e.g., checking units, unit conversions, transfers between software languages); warning and error handling; and coordinating sensitivity/uncertainty analyses. Although many computational software systems facilitate communication between, and execution of, components, there are no common approaches, protocols, or standards for turn-key linkages between software systems and models, especially if modifying components is not the intent. Using a standard ontology, this paper reviews how models can be described for discovery, understanding, evaluation, access, and implementation to facilitate interoperability and reusability. In the proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Mod
Wojdyla, Justyna Aleksandra; Kaminski, Jakub W; Panepucci, Ezequiel; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian
2018-01-01
Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.
Implications of Responsive Space on the Flight Software Architecture
NASA Technical Reports Server (NTRS)
Wilmot, Jonathan
2006-01-01
The Responsive Space initiative has several implications for flight software that need to be addressed not only within the run-time element, but in the development infrastructure and software life-cycle process elements as well. The runtime element must at a minimum support Plug & Play, while the development and process elements need to incorporate methods to quickly generate the needed documentation, code, tests, and all of the artifacts required of flight-quality software. Very rapid response times go even further, and imply little or no new software development, requiring instead using only predeveloped and certified software modules that can be integrated and tested through automated methods. These elements have typically been addressed individually with significant benefits, but it is when they are combined that they can have the greatest impact on Responsive Space. The Flight Software Branch at NASA's Goddard Space Flight Center has been developing the runtime, infrastructure and process elements needed for rapid integration with the Core Flight software System (CFS) architecture. The CFS architecture consists of three main components: the core Flight Executive (cFE), the component catalog, and the Integrated Development Environment (IDE). This paper will discuss the design of the components, how they facilitate rapid integration, and lessons learned as the architecture is utilized for an upcoming spacecraft.
Instrument control software development process for the multi-star AO system ARGOS
NASA Astrophysics Data System (ADS)
Kulas, M.; Barl, L.; Borelli, J. L.; Gässler, W.; Rabien, S.
2012-09-01
The ARGOS project (Advanced Rayleigh guided Ground layer adaptive Optics System) will upgrade the Large Binocular Telescope (LBT) with an AO System consisting of six Rayleigh laser guide stars. This adaptive optics system integrates several control loops and many different components like lasers, calibration swing arms and slope computers that are dispersed throughout the telescope. The purpose of the instrument control software (ICS) is running this AO system and providing convenient client interfaces to the instruments and the control loops. The challenges for the ARGOS ICS are the development of a distributed and safety-critical software system with no defects in a short time, the creation of huge and complex software programs with a maintainable code base, the delivery of software components with the desired functionality and the support of geographically distributed project partners. To tackle these difficult tasks, the ARGOS software engineers reuse existing software like the novel middleware from LINC-NIRVANA, an instrument for the LBT, provide many tests at different functional levels like unit tests and regression tests, agree about code and architecture style and deliver software incrementally while closely collaborating with the project partners. Many ARGOS ICS components are already successfully in use in the laboratories for testing ARGOS control loops.
An expert system based software sizing tool, phase 2
NASA Technical Reports Server (NTRS)
Friedlander, David
1990-01-01
A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
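A compact illustration of the two-stage scheme the abstract describes: rules map specification items to generic components, and a nonlinear sizing function produces the estimate. The rule table and calibration constants below are invented, not the project's knowledge base.

# Hedged reconstruction of the idea, with made-up rules and constants.
RULES = {"input screen": ("ui_form", 1.0),       # spec item -> (generic component, weight)
         "report": ("formatter", 0.8),
         "file interface": ("io_module", 1.2)}

def predict_sloc(spec_items):
    """Sum generic-component weights, then apply a nonlinear sizing curve."""
    w = sum(RULES[item][1] for item in spec_items)
    return 250 * w ** 1.15        # assumed calibration constants

print(int(predict_sloc(["input screen", "report", "file interface"])))  # ~884 SLOC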
Langer, Dominik; van 't Hoff, Marcel; Keller, Andreas J; Nagaraja, Chetan; Pfäffli, Oliver A; Göldi, Maurice; Kasper, Hansjörg; Helmchen, Fritjof
2013-04-30
Intravital microscopy such as in vivo imaging of brain dynamics is often performed with custom-built microscope setups controlled by custom-written software to meet specific requirements. Continuous technological advancement in the field has created a need for new control software that is flexible enough to support the biological researcher with innovative imaging techniques and provide the developer with a solid platform for quickly and easily implementing new extensions. Here, we introduce HelioScan, a software package written in LabVIEW, as a platform serving this dual role. HelioScan is designed as a collection of components that can be flexibly assembled into microscope control software tailored to the particular hardware and functionality requirements. Moreover, HelioScan provides a software framework, within which new functionality can be implemented in a quick and structured manner. A specific HelioScan application assembles at run-time from individual software components, based on user-definable configuration files. Due to its component-based architecture, HelioScan can exploit synergies of multiple developers working in parallel on different components in a community effort. We exemplify the capabilities and versatility of HelioScan by demonstrating several in vivo brain imaging modes, including camera-based intrinsic optical signal imaging for functional mapping of cortical areas, standard two-photon laser-scanning microscopy using galvanometric mirrors, and high-speed in vivo two-photon calcium imaging using either acousto-optic deflectors or a resonant scanner. We recommend HelioScan as a convenient software framework for the in vivo imaging community.
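HelioScan itself is written in LabVIEW; as a rough illustration of run-time assembly from a configuration file, the Python sketch below instantiates registered component classes from a config mapping. All component names are invented.

# Illustrative only: mimics assembling an application from a
# configuration at run time, as HelioScan does with its components.
REGISTRY = {}

def component(name):
    """Decorator that registers a component class under a config name."""
    def deco(cls):
        REGISTRY[name] = cls
        return cls
    return deco

@component("galvo_scanner")
class GalvoScanner:
    def start(self): print("galvo scanning")

@component("pmt_detector")
class PmtDetector:
    def start(self): print("PMT acquiring")

config = {"scanner": "galvo_scanner", "detector": "pmt_detector"}  # read from file
app = {role: REGISTRY[name]() for role, name in config.items()}
for part in app.values():
    part.start()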
Software Engineering Laboratory (SEL) programmer workbench phase 1 evaluation
NASA Technical Reports Server (NTRS)
1981-01-01
Phase 1 of the SEL programmer workbench consists of the design of the following three components: communications link, command language processor, and collection of software aids. A brief description, an evaluation, and recommendations are presented for each of these three components.
NASA Astrophysics Data System (ADS)
van Gend, Carel; Lombaard, Briehan; Sickafoose, Amanda; Whittal, Hamish
2016-07-01
Until recently, software for instruments on the smaller telescopes at the South African Astronomical Observatory (SAAO) has not been designed for remote accessibility and frequently has not been developed using modern software best practice. We describe a software architecture we have implemented for use with new and upgraded instruments at the SAAO. The architecture was designed to allow for multiple components and to be fast, reliable, remotely operable, support different user interfaces, employ as much non-proprietary software as possible, and to take future-proofing into consideration. Individual component drivers exist as standalone processes, communicating over a network. A controller layer coordinates the various components, and allows a variety of user interfaces to be used. The Sutherland High-speed Optical Cameras (SHOC) instruments incorporate an Andor electron-multiplying CCD camera, a GPS unit for accurate timing and a pair of filter wheels. We have applied the new architecture to the SHOC instruments, with the camera driver developed using Andor's software development kit. We have used this to develop an innovative web-based user interface to the instrument.
NASA Technical Reports Server (NTRS)
Clancey, William J.; Lowry, Michael R.; Nado, Robert Allen; Sierhuis, Maarten
2011-01-01
We analyzed a series of ten systematically developed surface exploration systems that integrated a variety of hardware and software components. Design, development, and testing data suggest that incremental buildup of an exploration system for long-duration capabilities is facilitated by an open architecture with appropriate-level APIs, specifically designed to facilitate integration of new components. This improves software productivity by reducing changes required for reconfiguring an existing system.
Component Verification and Certification in NASA Missions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)
2001-01-01
Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.
Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis
NASA Technical Reports Server (NTRS)
Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven
1997-01-01
Systematic software construction offers the potential of elevating software engineering from an art form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.
A system for automatic evaluation of simulation software
NASA Technical Reports Server (NTRS)
Ryan, J. P.; Hodges, B. C.
1976-01-01
Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From software requirement methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.
NASA Astrophysics Data System (ADS)
Syafiqah Syahirah Mohamed, Nor; Amalina Banu Mohamat Adek, Noor; Hamid, Nurul Farhana Abd
2018-03-01
This paper presents the development of Graphical User Interface (GUI) software for sizing the main components in an AC-coupled photovoltaic (PV) hybrid power system based on the Malaysian climate. The software provides guidelines for PV system integrators to effectively size components and configure the system to match the system and load requirements with the geographical conditions. The concept of the proposed software is balancing the annual average renewable energy generation and load demand. In this study, the PV to diesel generator (DG) ratio is introduced by considering the hybrid system energy contribution. The GUI software is able to size the main components in the PV hybrid system to meet the set target of the energy contribution ratio. The rated powers of the components to be defined are the PV array, grid-tie inverter, bi-directional inverter, battery storage, and DG. The GUI performs all the system sizing procedures and provides a user-friendly interface as a sizing tool for AC-coupled PV hybrid systems. The GUI is developed using Visual Studio 2015, based on real data for the Malaysian climate.
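The energy-balance concept behind such sizing can be shown with back-of-the-envelope arithmetic. The load, PV share, peak-sun-hours, and efficiency figures below are assumptions for illustration, not the paper's data.

# Illustrative arithmetic only: balance annual-average PV generation
# against the load share assigned by the PV:DG contribution ratio.
daily_load_kwh = 40.0
pv_share = 0.7                      # target PV:DG energy contribution ratio
peak_sun_hours = 4.5                # assumed equatorial-climate figure
system_efficiency = 0.75            # assumed wiring/inverter/temperature losses

pv_kwp = daily_load_kwh * pv_share / (peak_sun_hours * system_efficiency)
dg_kw = daily_load_kwh * (1 - pv_share) / 8.0   # assumed 8 h/day DG runtime

print(f"PV array      ~ {pv_kwp:.1f} kWp")      # ~8.3 kWp
print(f"Diesel genset ~ {dg_kw:.1f} kW")        # ~1.5 kW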
Interface Generation and Compositional Verification in JavaPathfinder
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina
2009-01-01
We present a novel algorithm for interface generation of software components. Given a component, our algorithm uses learning techniques to compute a permissive interface representing legal usage of the component. Unlike our previous work, this algorithm does not require knowledge about the component's environment. Furthermore, in contrast to other related approaches, our algorithm computes permissive interfaces even in the presence of non-determinism in the component. Our algorithm is implemented in the JavaPathfinder model checking framework for UML statechart components. We have also added support for automated assume-guarantee style compositional verification in JavaPathfinder, using component interfaces. We report on the application of the presented approach to the generation of interfaces for flight software components.
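What a generated interface expresses can be sketched as a small automaton over legal call sequences. The component and its call alphabet below are hypothetical, not JavaPathfinder output.

# Hypothetical interface automaton: clients of the component may be
# checked against the legal call sequences it encodes.
INTERFACE = {("closed", "open"): "opened",
             ("opened", "write"): "opened",
             ("opened", "close"): "closed"}

def legal(call_sequence, start="closed"):
    """Return True if the sequence of calls respects the interface."""
    state = start
    for call in call_sequence:
        state = INTERFACE.get((state, call))
        if state is None:
            return False          # the sequence violates the interface
    return True

print(legal(["open", "write", "close"]))   # True
print(legal(["write"]))                    # False: write before open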
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
Accomplishments are described for the second-year effort of a 3-year program to develop methodology for component-specific modeling of aircraft engine hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models; (2) geometry model generators; (3) remeshing; (4) specialty 3-D inelastic structural analysis; (5) computationally efficient solvers; (6) adaptive solution strategies; (7) engine performance parameters/component response variables decomposition and synthesis; (8) integrated software architecture and development; and (9) validation cases for software developed.
Design of a nickel-hydrogen battery simulator for the NASA EOS testbed
NASA Technical Reports Server (NTRS)
Gur, Zvi; Mang, Xuesi; Patil, Ashok R.; Sable, Dan M.; Cho, Bo H.; Lee, Fred C.
1992-01-01
The hardware and software design of a nickel-hydrogen (Ni-H2) battery simulator (BS) with application to the NASA Earth Observation System (EOS) satellite is presented. The battery simulator is developed as a part of a complete testbed for the EOS satellite power system. The battery simulator involves both hardware and software components. The hardware component includes the capability of sourcing and sinking current at a constant programmable voltage. The software component includes the capability of monitoring the battery's ampere-hours (Ah) and programming the battery voltage according to an empirical model of the nickel-hydrogen battery stored in a computer.
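A simplified sketch of the software component's role as described: integrate current into ampere-hours and look up terminal voltage from an empirical curve. The state-of-charge/voltage table below is an assumption, not the Ni-H2 model used in the testbed.

# Illustrative only: Ah bookkeeping plus an assumed per-cell voltage curve.
SOC_TO_V = [(0.0, 1.15), (0.5, 1.25), (0.9, 1.32), (1.0, 1.50)]  # assumed table

def interp_voltage(soc):
    """Piecewise-linear lookup of cell voltage from state of charge."""
    for (s0, v0), (s1, v1) in zip(SOC_TO_V, SOC_TO_V[1:]):
        if s0 <= soc <= s1:
            return v0 + (v1 - v0) * (soc - s0) / (s1 - s0)
    return SOC_TO_V[-1][1]

capacity_ah, ah = 50.0, 45.0                      # rated capacity, current charge
for current_a, dt_h in [(-10, 0.5), (5, 1.0)]:    # discharge step, then charge step
    ah = min(max(ah + current_a * dt_h, 0.0), capacity_ah)
    print(f"Ah={ah:4.1f}  V_cell={interp_voltage(ah / capacity_ah):.3f}")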
Effective Software Engineering Leadership for Development Programs
ERIC Educational Resources Information Center
Cagle West, Marsha
2010-01-01
Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…
Code of Federal Regulations, 2012 CFR
2012-10-01
... cohesion. Component means an electronic element, device, or appliance (including hardware or software) that... and software version, is documented and maintained through the life-cycle of the products in use. Executive software means software common to all installations of a given electronic product. It generally is...
Code of Federal Regulations, 2013 CFR
2013-10-01
... cohesion. Component means an electronic element, device, or appliance (including hardware or software) that... and software version, is documented and maintained through the life-cycle of the products in use. Executive software means software common to all installations of a given electronic product. It generally is...
Code of Federal Regulations, 2014 CFR
2014-10-01
... cohesion. Component means an electronic element, device, or appliance (including hardware or software) that... and software version, is documented and maintained through the life-cycle of the products in use. Executive software means software common to all installations of a given electronic product. It generally is...
Effectiveness of back-to-back testing
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.; Eckhardt, David E.; Caglayan, Alper; Kelly, John P. J.
1987-01-01
Three models of back-to-back testing processes are described. Two models treat the case where there is no intercomponent failure dependence. The third model describes the more realistic case where there is correlation among the failure probabilities of the functionally equivalent components. The theory indicates that back-to-back testing can, under the right conditions, provide a considerable gain in software reliability. The models are used to analyze the data obtained in a fault-tolerant software experiment. It is shown that the expected gain is indeed achieved, and exceeded, provided the intercomponent failure dependence is sufficiently small. However, even with relatively high correlation, the use of several functionally equivalent components coupled with back-to-back testing may provide a considerable reliability gain. Implications of this finding are that multiversion software development is a feasible and cost effective approach to providing highly reliable software components intended for fault-tolerant software systems, on condition that special attention is directed at early detection and elimination of correlated faults.
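The reliability comparison can be made concrete with a toy calculation. The failure probability and correlation values below are illustrative, and the simple two-term mixture is a stand-in for, not a reproduction of, the three models above.

# Toy version of the comparison: a fault slips past back-to-back testing
# of two versions only when both fail identically on the same input.
p = 1e-3            # assumed failure probability of a single version
rho = 0.1           # assumed fraction of failures that are coincident

p_single = p
p_pair = rho * p + (1 - rho) * p * p   # correlated part + independent part

print(f"single version : {p_single:.2e}")
print(f"back-to-back   : {p_pair:.2e}  (gain x{p_single / p_pair:.0f})")
# With rho = 0, the gain would be a full factor of 1/p; correlation caps it.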
Automated Software Development Workstation (ASDW)
NASA Technical Reports Server (NTRS)
Fridge, Ernie
1990-01-01
Software development is a serious bottleneck in the construction of complex automated systems. An increase of the reuse of software designs and components has been viewed as a way to relieve this bottleneck. One approach to achieving software reusability is through the development and use of software parts composition systems. A software parts composition system is a software development environment comprised of a parts description language for modeling parts and their interfaces, a catalog of existing parts, a composition editor that aids a user in the specification of a new application from existing parts, and a code generator that takes a specification and generates an implementation of a new application in a target language. The Automated Software Development Workstation (ASDW) is an expert system shell that provides the capabilities required to develop and manipulate these software parts composition systems. The ASDW is now in Beta testing at the Johnson Space Center. Future work centers on responding to user feedback for capability and usability enhancement, expanding the scope of the software lifecycle that is covered, and in providing solutions to handling very large libraries of reusable components.
Sultan, S; Hynes, N
2014-12-01
Patients with end-stage critical limb ischemia (CLI) survive on borrowed time, and amputation is inevitable if an aggressive management stratagem is not instigated. Our primary aim was to equate the effectiveness of subintimal angioplasty (SIA) and tibial balloon angioplasty (TBA) in sustaining clinical improvement and amputation-free survival (AFS) in patients with CLI TASC II D. Moreover, patients with severe CLI who were not suitable for revascularization and who were offered therapy with a sequential compression biomechanical device (SCBD) were scrutinised as part of a comprehensive lower limb salvage program. From 2002-2012, 5876 patients were referred with peripheral vascular disease (PVD); 987 presented with CLI and 798 had intervention; 189 patients presenting with CLI were not candidates for revascularisation, of whom 171 were offered SCBD. We formed a prospective observational group study of 441 patients who had TASC D disease. All of these patients presented as emergencies and were allocated to the next available treatment list. Duplex ultrasound arterial mapping (DUAM) was the sole preoperative investigation tool in 92% of all cases. Of the 441 patients studied, 190 patients (206 procedures) had SIA for TASC D femoro-popliteal occlusions, 80 patients (89 procedures) had TBA and cool excimer laser angioplasty (CELA) for tibial artery occlusions, and 171 patients with severe CLI were not suitable for revascularization and joined the SCBD program. Mean age (SIA 73±13 years vs. TBA/CELA 74±8 years vs. SCBD 75±13 years) and comorbidity severity scores (P>0.05) were similar between groups. Perioperative mortality was 1.6% within the SIA group vs. 0% within the TBA group and 0.6% in the SCBD group. Length of hospital stay within the TBA group was 3.8±2 days vs. 14±16 days for SIA, P<0.0001. The 5-year freedom from major adverse events (MAE) for the SIA group was 68%, which was comparable to the results obtained for both the TBA group (59%) and the SCBD group (62.5%) (P=0.1935). Five-year freedom from target lesion revascularization was 85.9% within the SIA group and 79% within the TBA group. A sustained clinical improvement was seen in 82.8% of primary SIA and 68% of TBA, which mimics the outcome of SCBD at 68% at one year. A total of 83% of SCBD patients had no rest pain within one week of starting the program, and gangrene remained dry and non-progressive. Ulceration healed in all but 12 patients. There were no device-related complications. Limb salvage was 94% at 5 years. All-cause survival was 69%. Quality time spent without symptoms of disease or toxicity of treatment (Q-TWiST) was 24.7 months for SIA, 8.5 months for TBA, and 38.13 months for SCBD, for a total of 708 months of usage. Cost per quality-adjusted life year (QALY) was € 5662.79 for SIA, € 12,935.18 for TBA, and € 2943.56 for SCBD. All treatment pathways augmented patient-specific Q-TWiST with substantial cost reduction. SIA, TBA and SCBD expand AFS and symptom-free survival. All treatment modalities are minimally invasive and allow for a high patient turnover without compromising limb salvage, once they are performed by experienced vascular surgeons in high deliberate-practice-volume centers.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-23
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-752] Certain Gaming and Entertainment Consoles, Related Software, and Components Thereof; Notice of Commission Determination To Review a Final Initial Determination Finding a Violation of Section 337; Remand of the Investigation to the...
Weaves as an Interconnection Fabric for ASIM's and Nanosatellites
NASA Technical Reports Server (NTRS)
Gorlick, Michael M.
1995-01-01
Many of the micromachines under consideration require computer support; indeed, one of the appeals of this technology is the ability to intermix mechanical, optical, analog, and digital devices on the same substrate. The amount of computer power is rarely an issue; the sticking point is the complexity of the software required to make effective use of these devices. Micromachines are the nano-technologist's equivalent of 'golden screws'. In other words, they will be piece parts in larger assemblages. For example, a nano-satellite may be composed of stacked silicon wafers where each wafer contains hundreds to thousands of micromachines, digital controllers, general purpose computers, memories, and high-speed bus interconnects. Comparatively few of these devices will be custom designed; most will be stock parts selected from libraries and catalogs. The novelty will lie in the interconnections. For example, a digital accelerometer may be a component part in an adaptive suspension, a monitoring element embedded in the wrapper of a package, or a portion of the smart skin of a launch vehicle. In each case, this device must inter-operate with other devices and probes for the purposes of command, control, and communication. We propose a software technology called 'weaves' that will permit large collections of micromachines and their attendant computers to freely intercommunicate while preserving modularity, transparency, and flexibility. Weaves are composed of networks of communicating software components. The network, and the components comprising it, may be changed even while the software, and the devices it controls, are executing. This unusual degree of software plasticity permits micromachines to dynamically adapt the software to changing conditions and allows system engineers to rapidly and inexpensively develop special purpose software by assembling stock software components in custom configurations.
Robotics On-Board Trainer (ROBoT)
NASA Technical Reports Server (NTRS)
Johnson, Genevieve; Alexander, Greg
2013-01-01
ROBoT is an on-orbit version of the ground-based Dynamics Skills Trainer (DST) that astronauts use for training on a frequent basis. This software consists of two primary software groups. The first series of components is responsible for displaying the graphical scenes. The remaining components are responsible for simulating the Mobile Servicing System (MSS), the Japanese Experiment Module Remote Manipulator System (JEMRMS), and the H-II Transfer Vehicle (HTV) Free Flyer Robotics Operations. The MSS simulation software includes: Robotic Workstation (RWS) simulation, a simulation of the Space Station Remote Manipulator System (SSRMS), a simulation of the ISS Command and Control System (CCS), and a portion of the Portable Computer System (PCS) software necessary for MSS operations. These components all run under the CentOS4.5 Linux operating system. The JEMRMS simulation software includes real-time HIL dynamics, manipulator multi-body dynamics, and a moving-object contact model with Trick's discrete-time scheduling. The JEMRMS DST will be used as a functional proficiency and skills trainer for flight crews. The HTV Free Flyer Robotics Operations simulation software adds a functional simulation of HTV vehicle controllers, sensors, and data to the MSS simulation software. These components are intended to support HTV ISS visiting vehicle analysis and training. The scene generation software will use DOUG (Dynamic On-orbit Ubiquitous Graphics) to render the graphical scenes. DOUG runs on a laptop running the CentOS4.5 Linux operating system. DOUG is an OpenGL-based 3D computer graphics rendering package. It uses pre-built three-dimensional models of on-orbit ISS and space shuttle systems elements, and provides real-time views of various station and shuttle configurations.
NASA Astrophysics Data System (ADS)
Yetman, G.; Downs, R. R.
2011-12-01
Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and addresses internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-752] Certain Gaming and Entertainment Consoles, Related Software, and Components Thereof; Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-24
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-720] In the Matter of Certain Biometric... accessing its Internet server at http://www.usitc.gov . The public record for this investigation may be... certain biometric scanning devices, components thereof, associated software, and products containing the...
NASA Technical Reports Server (NTRS)
Ensey, Tyler S.
2013-01-01
During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented into hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library component; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, they are intertwined together into one integrated model. This integrated model is then tested itself, through a test script and autotest, so that it can be concluded that all models work conjointly, for a single purpose. The component I was assigned, specifically, was a fluid component, a discrete pressure switch. The switch takes a fluid pressure input, and if the pressure is greater than a designated cutoff pressure, the switch stops fluid flow.
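The described pressure-switch component reduces to a few lines of behavior. The sketch below is Python rather than Simulink, uses an invented cutoff value, and is only a behavioral illustration of the component just described.

# Behavioral sketch of a discrete pressure switch: above the cutoff,
# the switch trips and blocks flow. Cutoff value is assumed.
def pressure_switch(p_in_psi, cutoff_psi=150.0):
    """Return (tripped, flow_allowed) for one simulation step."""
    tripped = p_in_psi > cutoff_psi
    return tripped, not tripped

for p in (100.0, 149.9, 150.1, 200.0):
    tripped, flow = pressure_switch(p)
    print(f"{p:6.1f} psi -> tripped={tripped}  flow={flow}")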
GCS component development cycle
NASA Astrophysics Data System (ADS)
Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti
2012-09-01
The GTC is an optical-infrared 10-meter segmented mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13/07/2007, and since then it has been in the operation phase. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA, and it is responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysing (use cases) and designing (UML) any of the GCS subsystems, an initial component description of its interface is obtained, and from that information a component specification is written. In order to improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework, called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, in only one step the component is generated, compiled and deployed to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve of new developers and the development error rate, allows a systematic use of design patterns in the development and software reuse, speeds up the delivery of the software product and markedly improves the timescale, design consistency and design quality, and eliminates the future refactoring process that would otherwise be required for the code.
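The specification-to-skeleton transformation can be illustrated in miniature. The GCS tools are Java/CORBA based, so the Python stand-in below, with an invented component spec, only shows the shape of the generation step.

# Illustrative only: generate a component skeleton from a specification.
# The spec, attribute names, and base class are hypothetical.
SPEC = {"name": "M2Controller",
        "attributes": ["position", "temperature"],
        "commands": ["park", "track"]}

def generate_skeleton(spec):
    """Emit a class skeleton with one stub per attribute and command."""
    lines = [f"class {spec['name']}(DeviceComponent):"]
    for a in spec["attributes"]:
        lines.append(f"    def get_{a}(self):")
        lines.append(f"        raise NotImplementedError  # monitor '{a}'")
    for c in spec["commands"]:
        lines.append(f"    def {c}(self):")
        lines.append(f"        raise NotImplementedError  # command '{c}'")
    return "\n".join(lines)

print(generate_skeleton(SPEC))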
A Core Plug and Play Architecture for Reusable Flight Software Systems
NASA Technical Reports Server (NTRS)
Wilmot, Jonathan
2006-01-01
The Flight Software Branch, at Goddard Space Flight Center (GSFC), has been working on a run-time approach to facilitate a formal software reuse process. The reuse process is designed to enable rapid development and integration of high-quality software systems and to more accurately predict development costs and schedule. Previous reuse practices have been somewhat successful when the same teams are moved from project to project. But this typically requires taking the software system in an all-or-nothing approach where useful components cannot be easily extracted from the whole. As a result, the system is less flexible and scalable with limited applicability to new projects. This paper will focus on the rationale behind, and implementation of the run-time executive. This executive is the core for the component-based flight software commonality and reuse process adopted at Goddard.
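A toy sketch of the run-time executive idea: components register with a core and communicate through its message bus, so they can be mixed and matched per mission. This is illustrative, not the cFE API, and all names are invented.

# Minimal publish/subscribe executive, in the spirit of a reusable core.
class Executive:
    def __init__(self):
        self.components, self.subscribers = {}, {}

    def register(self, name, component):
        self.components[name] = component          # plug a component in

    def subscribe(self, topic, name):
        self.subscribers.setdefault(topic, []).append(name)

    def publish(self, topic, msg):
        for name in self.subscribers.get(topic, []):
            self.components[name].handle(topic, msg)

class Logger:
    def handle(self, topic, msg):
        print(f"[{topic}] {msg}")

exe = Executive()
exe.register("logger", Logger())
exe.subscribe("housekeeping", "logger")
exe.publish("housekeeping", {"battery_v": 28.1})   # [housekeeping] {...}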
A theoretical basis for the analysis of redundant software subject to coincident errors
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.; Lee, L. D.
1985-01-01
Fundamental to the development of redundant software techniques, known as fault-tolerant software, is an understanding of the impact of multiple joint occurrences of errors, referred to as coincident errors. A theoretical basis for the study of redundant software is developed which provides a probabilistic framework for empirically evaluating the effectiveness of the general (N-Version) strategy when component versions are subject to coincident errors, and permits an analytical study of the effects of these errors. The basic assumptions of the model are: (1) independently designed software components are chosen in a random sample; and (2) in the user environment, the system is required to execute on a stationary input series. The intensity of coincident errors has a central role in the model. This function describes the propensity to introduce design faults in such a way that software components fail together when executing in the user environment. The model is used to give conditions under which an N-Version system is a better strategy for reducing system failure probability than relying on a single version of software. A condition which limits the effectiveness of a fault-tolerant strategy is studied, and it is posed whether system failure probability varies monotonically with increasing N or whether an optimal choice of N exists.
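As a rough numerical illustration of why coincident errors limit the N-Version strategy (this is not the paper's exact model), consider majority voting where a small fraction of "hard" inputs makes every version much more likely to fail at once. All parameter values below are invented:

```python
# A minimal sketch: failure probability of an N-version majority voter
# under independent failures versus a mixture where a fraction q of
# "hard" inputs induces coincident failures. Parameters are illustrative.
from math import comb

def majority_failure(n: int, p: float) -> float:
    """P(more than half of n independent versions fail), each with prob p."""
    k_min = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min, n + 1))

def mixture_failure(n: int, p_easy: float, p_hard: float, q: float) -> float:
    """Average over inputs: fraction q is 'hard', raising every version's
    failure probability simultaneously (the coincident-error effect)."""
    return (1 - q) * majority_failure(n, p_easy) + q * majority_failure(n, p_hard)

for n in (1, 3, 5, 7):
    ind = majority_failure(n, 0.01)  # per-version failure prob 0.01, independent
    cor = mixture_failure(n, p_easy=0.001, p_hard=0.5, q=0.018)  # same ~0.01 mean
    print(f"N={n}: independent={ind:.2e}  coincident={cor:.2e}")
```

With the same average per-version failure probability, the independent case improves rapidly with N, while the coincident term keeps system failure probability near q/2 no matter how large N grows, which is the kind of limiting condition the paper studies.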
Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns
NASA Technical Reports Server (NTRS)
Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.
2006-01-01
Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to component based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use the code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.
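A toy sketch of the rule-based idea follows, with a single composition rule: connect a producer component to a consumer when their data types match. The pattern table and glue format are invented; the paper's rule base and composition operations are far richer than this.

```python
# Toy sketch of pattern-guided glue synthesis (illustrative only).
# Rule: a producer may feed a consumer when output type == input type.
PATTERNS = {
    "sensor_read":  {"out": "Frame"},
    "filter_noise": {"in": "Frame", "out": "Frame"},
    "actuate":      {"in": "Frame"},
}

def synthesize_chain(goal_chain: list[str]) -> str:
    """Check type compatibility along a chain and emit trivial glue code."""
    glue = []
    for src, dst in zip(goal_chain, goal_chain[1:]):
        if PATTERNS[src].get("out") != PATTERNS[dst].get("in"):
            raise TypeError(f"cannot connect {src} -> {dst}")
        glue.append(f"{dst}_input = {src}_output  # synthesized connection")
    return "\n".join(glue)

print(synthesize_chain(["sensor_read", "filter_noise", "actuate"]))
```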
Understanding software faults and their role in software reliability modeling
NASA Technical Reports Server (NTRS)
Munson, John C.
1994-01-01
This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability become better understood in the modeling process, this information begins to have important implications on the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analysis such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the regression equation. Since most of the existing metrics have common elements and are linear combinations of these common elements, it seems reasonable to investigate the structure of the underlying common factors or components that make up the raw metrics. The technique we have chosen to use to explore this structure is a procedure called principal components analysis. Principal components analysis is a decomposition technique that may be used to detect and analyze collinearity in software metrics. When confronted with a large number of metrics measuring a single construct, it may be desirable to represent the set by some smaller number of variables that convey all, or most, of the information in the original set. Principal components are linear transformations of a set of random variables that summarize the information contained in the variables. The transformations are chosen so that the first component accounts for the maximal amount of variation of the measures of any possible linear transform; the second component accounts for the maximal amount of residual variation; and so on. The principal components are constructed so that they represent transformed scores on dimensions that are orthogonal. Through the use of principal components analysis, it is possible to have a set of highly related software attributes mapped into a small number of uncorrelated attribute domains. This definitively solves the problem of multi-collinearity in subsequent regression analysis. 
There are many software metrics in the literature, but principal component analysis reveals that there are few distinct sources of variation, i.e. dimensions, in this set of metrics. It would appear perfectly reasonable to characterize the measurable attributes of a program with a simple function of a small number of orthogonal metrics each of which represents a distinct software attribute domain.
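A compact sketch of that decomposition on toy, deliberately collinear metrics (the metric names and data are invented for illustration):

```python
# Principal components analysis on correlated software metrics (toy data).
import numpy as np

rng = np.random.default_rng(0)
loc = rng.normal(200, 50, 100)                 # lines of code
stmts = 0.8 * loc + rng.normal(0, 5, 100)      # statements: nearly collinear with LOC
branches = 0.1 * loc + rng.normal(0, 8, 100)   # branch count: weaker relationship
X = np.column_stack([loc, stmts, branches])

Z = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize each metric
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]              # largest variance first
print("variance explained:", np.round(eigvals[order] / eigvals.sum(), 3))

scores = Z @ eigvecs[:, order]                 # scores on orthogonal domains
corr = np.corrcoef(scores, rowvar=False)
print("corr(PC1, PC2):", round(float(corr[0, 1]), 6))
```

On data like this, the first component absorbs most of the variance and the correlations between component scores are numerically zero, which is exactly the orthogonality property the passage relies on to escape multicollinearity.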
A Role-Playing Game for a Software Engineering Lab: Developing a Product Line
ERIC Educational Resources Information Center
Zuppiroli, Sara; Ciancarini, Paolo; Gabbrielli, Maurizio
2012-01-01
Software product line development refers to software engineering practices and techniques for creating families of similar software systems from a basic set of reusable components, called shared assets. Teaching how to deal with software product lines in a university lab course is a challenging task, because there are several practical issues that…
Application-Program-Installer Builder
NASA Technical Reports Server (NTRS)
Wolgast, Paul; Demore, Martha; Lowik, Paul
2007-01-01
A computer program builds application programming interfaces (APIs) and related software components for installing and uninstalling application programs in any of a variety of computers and operating systems that support the Java programming language in its binary form. This program is partly similar in function to commercial (e.g., Install-Shield) software. This program is intended to enable satisfaction of a quasi-industry-standard set of requirements for a set of APIs that would enable such installation and uninstallation and that would avoid the pitfalls that are commonly encountered during installation of software. The requirements include the following: 1) Properly detecting prerequisites to an application program before performing the installation; 2) Properly registering component requirements; 3) Correctly measuring the required hard-disk space, including accounting for prerequisite components that have already been installed; and 4) Correctly uninstalling an application program. Correct uninstallation includes (1) detecting whether any component of the program to be removed is required by another program, (2) not removing that component, and (3) deleting references to requirements of the to-be-removed program for components of other programs so that those components can be properly removed at a later time.
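Requirement 4 is the subtle one. A minimal sketch of the reference-counting idea behind safe uninstallation follows; the data model is hypothetical, not the program's actual API:

```python
# Sketch of requirement 4 (safe uninstallation): delete only components
# that no other installed application still requires. Hypothetical model.
registry = {
    "AppA": {"components": ["jre", "libxml"]},
    "AppB": {"components": ["jre"]},
}

def uninstall(app: str) -> None:
    """Remove an application, keeping components other apps depend on."""
    for comp in registry[app]["components"]:
        still_needed = any(
            comp in info["components"]
            for name, info in registry.items() if name != app
        )
        if still_needed:
            print(f"keeping {comp}: required by another application")
        else:
            print(f"removing {comp}")
    del registry[app]   # drop this app's requirement records

uninstall("AppA")   # keeps 'jre' (AppB needs it), removes 'libxml'
```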
High performance VLSI telemetry data systems
NASA Technical Reports Server (NTRS)
Chesney, J.; Speciale, N.; Horner, W.; Sabia, S.
1990-01-01
NASA's deployment of major space complexes such as Space Station Freedom (SSF) and the Earth Observing System (EOS) will demand increased functionality and performance from ground based telemetry acquisition systems well above current system capabilities. Adaptation of space telemetry data transport and processing standards, such as those specified by the Consultative Committee for Space Data Systems (CCSDS) and those required for commercial ground distribution of telemetry data, will drive these functional and performance requirements. In addition, budget limitations will force the requirement for higher modularity, flexibility, and interchangeability at lower cost in new ground telemetry data system elements. At NASA's Goddard Space Flight Center (GSFC), the design and development of generic ground telemetry data system elements over the last five years has resulted in significant solutions to these problems. This solution, referred to as the functional components approach, includes both hardware and software components ready for end user application. The hardware functional components consist of modern data flow architectures utilizing Application Specific Integrated Circuits (ASICs) developed specifically to support NASA's telemetry data systems needs and designed to meet a range of data rate requirements up to 300 Mbps. Real-time operating system software components support both embedded local software intelligence and overall system control, status, processing, and interface requirements. These components, hardware and software, form the superstructure upon which project specific elements are added to complete a telemetry ground data system installation. This paper describes the functional components approach, some specific component examples, and a project example of the evolution from VLSI component, to basic board level functional component, to integrated telemetry data system.
NASA Technical Reports Server (NTRS)
Fletcher, Daryl P.; Alena, Richard L.; Akkawi, Faisal; Duncavage, Daniel P.
2004-01-01
This paper presents some of the challenges associated with bringing software projects from the research world into an operational environment. While the core functional components of research-oriented software applications can have great utility in an operational setting, these applications often lack aspects important in an operational environment such as logging and security. Furthermore, these stand-alone applications, sometimes developed in isolation from one another, can produce data products useful to other applications in a software ecosystem.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyer, W.B.
1979-09-01
This report describes both the hardware and software components of an automatic calibration and signal system (Autocal) for the data acquisition system for the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described.
Digital echocardiography 2002: now is the time
NASA Technical Reports Server (NTRS)
Thomas, James D.; Greenberg, Neil L.; Garcia, Mario J.
2002-01-01
The ability to acquire echocardiographic images digitally, store and transfer these data using the DICOM standard, and routinely analyze examinations exists today and allows the implementation of a digital echocardiography laboratory. The purpose of this review article is to outline the critical components of a digital echocardiography laboratory, discuss general strategies for implementation, and put forth some of the pitfalls that we have encountered in our own implementation. The major components of the digital laboratory include (1) digital echocardiography machines with network output, (2) a switched high-speed network, (3) a high throughput server with abundant local storage, (4) a reliable low-cost archive, (5) software to manage information, and (6) support mechanisms for software and hardware. Implementation strategies can vary from a complete vendor solution providing all components (hardware, software, support), to a strategy similar to our own where standard computer and networking hardware are used with specialized software for management of image and measurement information.
Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio
1997-01-01
In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data was collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.
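C4.5 induces a logical classification tree from labeled examples. As a rough illustration of the modeling step, the sketch below substitutes scikit-learn's CART decision tree for C4.5 (they are related but not identical algorithms); the feature names and data are invented:

```python
# Illustrative stand-in for the paper's C4.5 model: a CART decision tree
# trained on toy component measures. Features and labels are invented.
from sklearn.tree import DecisionTreeClassifier, export_text

X = [[120, 3, 0.8], [450, 9, 0.2], [80, 2, 0.9], [600, 12, 0.1],
     [200, 5, 0.6], [550, 11, 0.3]]          # [size, fan-out, comment ratio]
y = [0, 1, 0, 1, 0, 1]                       # 0 = cheap rework, 1 = costly rework

clf = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(clf, feature_names=["size", "fan_out", "comment_ratio"]))
print("predicted class:", clf.predict([[300, 7, 0.4]])[0])
```

The printed rules are the same kind of artifact the authors turned into coding guidelines: thresholds on product attributes that separate low-rework from high-rework components.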
The component-based architecture of the HELIOS medical software engineering environment.
Degoulet, P; Jean, F C; Engelmann, U; Meinzer, H P; Baud, R; Sandblad, B; Wigertz, O; Le Meur, R; Jagermann, C
1994-12-01
The constitution of highly integrated health information networks and the growth of multimedia technologies raise new challenges for the development of medical applications. We describe in this paper the general architecture of the HELIOS medical software engineering environment, devoted to the development and maintenance of multimedia distributed medical applications. HELIOS is made of a set of software components, federated by a communication channel called the HELIOS Unification Bus. The HELIOS kernel includes three main components: the Analysis-Design Environment, the Object Information System and the Interface Manager. HELIOS services consist of a collection of toolkits providing the necessary facilities to medical application developers. They include Image Related services, a Natural Language Processor, a Decision Support System and Connection services. The project gives special attention to both object-oriented approaches and software re-usability, which are considered crucial steps towards the development of more reliable, coherent and integrated applications.
Translator for Optimizing Fluid-Handling Components
NASA Technical Reports Server (NTRS)
Landon, Mark; Perry, Ernest
2007-01-01
A software interface has been devised to facilitate optimization of the shapes of valves, elbows, fittings, and other components used to handle fluids under extreme conditions. This software interface translates data files generated by PLOT3D (a NASA grid-based plotting-and- data-display program) and by computational fluid dynamics (CFD) software into a format in which the files can be read by Sculptor, which is a shape-deformation- and-optimization program. Sculptor enables the user to interactively, smoothly, and arbitrarily deform the surfaces and volumes in two- and three-dimensional CFD models. Sculptor also includes design-optimization algorithms that can be used in conjunction with the arbitrary-shape-deformation components to perform automatic shape optimization. In the optimization process, the output of the CFD software is used as feedback while the optimizer strives to satisfy design criteria that could include, for example, improved values of pressure loss, velocity, flow quality, mass flow, etc.
Using component technology to facilitate external software reuse in ground-based planning systems
NASA Technical Reports Server (NTRS)
Chase, A.
2003-01-01
APGEN (Activity Plan GENerator), a multi-mission planning tool, must interface with external software to best serve its users. APGEN's original method for incorporating external software, the User-Defined library mechanism, has been very successful in allowing APGEN users access to external software functionality.
Sahraneshin Samani, Fazel; Moore, Jodene K; Khosravani, Pardis; Ebrahimi, Marzieh
2014-08-01
Flow cytometers designed to analyze large particles are enabling new applications in biology. Data analysis is a critical component of the FCM process. In this article we compare features of four free software packages, including WinMDI, Cyflogic, Flowing Software, and Cytobank.
A Linguistic Model in Component Oriented Programming
NASA Astrophysics Data System (ADS)
Crăciunean, Daniel Cristian; Crăciunean, Vasile
2016-12-01
It is a fact that component-oriented programming, well organized, can bring a large increase in efficiency in the development of large software systems. This paper proposes a model for building software systems by assembling components that can operate independently of each other. The model is based on a computing environment that runs parallel and distributed applications. This paper introduces concepts such as the abstract aggregation scheme and the aggregation application. Basically, an aggregation application is an application that is obtained by combining corresponding components. In our model an aggregation application is a word in a language.
Lichtenberg, M; Stahlhoff, W; Boese, D
2013-08-01
Single center observational study analyzing the primary patency rate and freedom from target lesion revascularization rate of the Pulsar-18 nitinol stent after recanalization of long superficial femoral artery (SFA) occlusions (TASC D) in 22 patients with critical limb ischemia (CLI). Between 1/2011 and 7/2011, 22 consecutive patients (9 male, 13 female) with chronic total occlusions (CTO) of the femoro-popliteal arteries presenting with CLI (17 patients with Rutherford 4 score, and 5 patients with Rutherford 5 score) were enrolled and successfully recanalized using the Pulsar-18 self-expanding (SE) nitinol stent (BIOTRONIK AG, Buelach, Switzerland). Primary patency at 12 months was defined as no binary restenosis (>50%) on duplex ultrasound (PSVR <2.5) and, respectively, no target lesion revascularization performed within 12 months. The average lesion length of the treated femoro-popliteal segment was 315 mm. Performing spot stenting, the average stent length in all patients was 245 mm (minimum 215 mm, maximum 315 mm). Technical success, with establishing an antegrade straight-line flow to the foot through a reopened SFA, was achieved in all 22 patients. Subintimal and intraluminal recanalization techniques were used. Two patients with Rutherford 5 score had a minor amputation shortly after the recanalization procedure. All other patients had complete wound healing of their lesions during a 6-month follow-up. After 12-month follow-up the primary patency rate of the Pulsar-18 SE nitinol stent was 77%, with a per-protocol restenosis in 5 of 22 patients. Seventeen patients showed a walking capacity on treadmill test >300 meters (Rutherford II). Two patients with a documented restenosis were Rutherford II; these patients were treated conservatively. Three patients with restenosis and a Rutherford III score were scheduled for an endovascular target lesion revascularization, leading to a freedom from target lesion revascularization rate of 86%. Endovascular intervention of long SFA occlusions using subintimal or intraluminal recanalization technique with implantation of the Pulsar-18 SE nitinol stent in CLI patients is safe and clinically effective, with a primary patency rate after 12 months of 77% and a freedom from target lesion revascularization rate of 86%.
NASA Astrophysics Data System (ADS)
Sangiorgi, Pierluca; Capalbi, Milvia; Gimenes, Renato; La Rosa, Giovanni; Russo, Francesco; Segreto, Alberto; Sottile, Giuseppe; Catalano, Osvaldo
2016-07-01
The purpose of this contribution is to present the current status of the software architecture of the ASTRI SST-2M Cherenkov Camera. The ASTRI SST-2M telescope is an end-to-end prototype for the Small Size Telescope of the Cherenkov Telescope Array. The ASTRI camera is an innovative instrument based on SiPM detectors and has several internal hardware components. In this contribution we will give a brief description of the hardware components of the camera of the ASTRI SST-2M prototype and of their interconnections. Then we will present the outcome of the software architectural design process that we carried out in order to identify the main structural components of the camera software system and the relationships among them. We will analyze the architectural model that describes how the camera software is organized as a set of communicating blocks. Finally, we will show where these blocks are deployed in the hardware components and how they interact. We will describe in some detail the physical communication ports and external ancillary devices management, the high precision time-tag management, the fast data collection and the fast data exchange between different camera subsystems, and the interfacing with the external systems.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-07
... Certain Mobile Devices, Associated Software, and Components Thereof, DN 2757; the Commission is soliciting... into the United States, the sale for importation, and the sale within the United States after importation of certain mobile devices, associated software, and components thereof. The complaint names as...
The large scale microelectronics Computer-Aided Design and Test (CADAT) system
NASA Technical Reports Server (NTRS)
Gould, J. M.
1978-01-01
The CADAT system consists of a number of computer programs written in FORTRAN that provide the capability to simulate, lay out, analyze, and create the artwork for large scale microelectronics. The function of each software component of the system is described with references to specific documentation for each software component.
NASA Astrophysics Data System (ADS)
Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens
2015-04-01
Recent investments in HPC, cloud and petascale data stores have dramatically increased the scale and resolution at which earth science challenges can now be tackled. These new infrastructures are highly parallelised, and to fully utilise them and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity and cost of the new infrastructures mean any software deployed has to be reliable, trusted and reusable. Increasingly software is available via open source repositories, but these usually only enable code to be discovered and downloaded. As a user, it is hard for a scientist to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund the development of software, to gain credit for the effort, IP, time and dollars spent, and facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework will have five separate, but connected components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. The registration process should include information about licensing and the hardware environments the code can run on, define appropriate validation (testing) procedures, and list the critical dependencies. 2) The Review component targets verification of the software, typically against a set of benchmark cases. This will be achieved by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate journals (e.g., Geoscientific Model Development) to assist users in knowing which codes to trust. 3) Referencing will be accomplished by linking the Software Framework to groups such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on information supplied in the registration process, benchmark cases described in the review, and other relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing provenance workflow engines that automatically capture information relating to a particular run of the software, including identification of all input and output artefacts and all elements and transactions within the workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it, and greatly facilitate sharing, reuse and reinstallation of code. Properly designed, it could scale out to massively parallel systems and be accessed nationally and internationally for multiple use cases, including supercomputer centres, cloud facilities, and local computers.
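A Register entry might plausibly look like the record below; the field names and values are invented to illustrate the information the registration process is said to capture (license, target environments, validation procedure, critical dependencies):

```python
# Hypothetical shape of a Register-component entry; the abstract describes
# the information to capture, not this concrete format or these values.
import json

entry = {
    "name": "escript-solver",
    "repository": "https://example.org/escript-solver.git",   # placeholder URL
    "license": "Apache-2.0",
    "environments": ["linux-x86_64 cluster", "cloud VM"],
    "validation": {"benchmark_suite": "cookbook-cases", "command": "make check"},
    "dependencies": [{"name": "python", "version": ">=3.8"},
                     {"name": "numpy", "version": ">=1.20"}],
}
print(json.dumps(entry, indent=2))
```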
Moody, George B; Mark, Roger G; Goldberger, Ary L
2011-01-01
PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.
Python as a federation tool for GENESIS 3.0.
Cornelis, Hugo; Rodriguez, Armando L; Coop, Allan D; Bower, James M
2012-01-01
The GENESIS simulation platform was one of the first broad-scale modeling systems in computational biology to encourage modelers to develop and share model features and components. Supported by a large developer community, it participated in innovative simulator technologies such as benchmarking, parallelization, and declarative model specification and was the first neural simulator to define bindings for the Python scripting language. An important feature of the latest version of GENESIS is that it decomposes into self-contained software components complying with the Computational Biology Initiative federated software architecture. This architecture allows separate scripting bindings to be defined for different necessary components of the simulator, e.g., the mathematical solvers and graphical user interface. Python is a scripting language that provides rich sets of freely available open source libraries. With clean dynamic object-oriented designs, they produce highly readable code and are widely employed in specialized areas of software component integration. We employ a simplified wrapper and interface generator to examine an application programming interface and make it available to a given scripting language. This allows independent software components to be 'glued' together and connected to external libraries and applications from user-defined Python or Perl scripts. We illustrate our approach with three examples of Python scripting. (1) Generate and run a simple single-compartment model neuron connected to a stand-alone mathematical solver. (2) Interface a mathematical solver with GENESIS 3.0 to explore a neuron morphology from either an interactive command-line or graphical user interface. (3) Apply scripting bindings to connect the GENESIS 3.0 simulator to external graphical libraries and an open source three dimensional content creation suite that supports visualization of models based on electron microscopy and their conversion to computational models. Employed in this way, the stand-alone software components of the GENESIS 3.0 simulator provide a framework for progressive federated software development in computational neuroscience.
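The "glue" style of example (1) can be suggested with a short sketch. The solver class below is a pure-Python stand-in, not the real wrapped GENESIS 3.0 component or its actual bindings; only the pattern (user script builds a solver component, runs it, collects results) is the point.

```python
# Hedged sketch of scripting glue in the style of example (1): names and
# the solver itself are hypothetical, not the real GENESIS 3.0 bindings.
class LeakyCompartmentSolver:       # stand-in for a wrapped C solver component
    def __init__(self, dt: float):
        self.dt, self.t, self.v = dt, 0.0, -65.0
    def step(self) -> float:
        """One Euler step of a leaky compartment with constant input."""
        self.v += self.dt * (-(self.v + 65.0) / 10.0 + 1.0)
        self.t += self.dt
        return self.v

solver = LeakyCompartmentSolver(dt=0.1)   # 'glue' code: build, run, collect
trace = [solver.step() for _ in range(100)]
print(f"membrane potential after {solver.t:.1f} ms: {trace[-1]:.2f} mV")
```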
NASA Technical Reports Server (NTRS)
Pearson, Don; Hamm, Dustin; Kubena, Brian; Weaver, Jonathan K.
2010-01-01
An updated version of the Platform Independent Software Components for the Exploration of Space (PISCES) software library is available. A previous version was reported in Library for Developing Spacecraft-Mission-Planning Software (MSC-22983), NASA Tech Briefs, Vol. 25, No. 7 (July 2001), page 52. To recapitulate: This software provides for Web-based, collaborative development of computer programs for planning trajectories and trajectory- related aspects of spacecraft-mission design. The library was built using state-of-the-art object-oriented concepts and software-development methodologies. The components of PISCES include Java-language application programs arranged in a hierarchy of classes that facilitates the reuse of the components. As its full name suggests, the PISCES library affords platform-independence: The Java language makes it possible to use the classes and application programs with a Java virtual machine, which is available in most Web-browser programs. Another advantage is expandability: Object orientation facilitates expansion of the library through creation of a new class. Improvements in the library since the previous version include development of orbital-maneuver- planning and rendezvous-launch-window application programs, enhancement of capabilities for propagation of orbits, and development of a desktop user interface.
Engine structures analysis software: Component Specific Modeling (COSMO)
NASA Astrophysics Data System (ADS)
McKnight, R. L.; Maffeo, R. J.; Schwartz, S.
1994-08-01
A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.
2010-01-01
offshoring, or producing major software components overseas (Defense Science Board, 2009). These trends raise concerns about the level of trust that...
Framework for End-User Programming of Cross-Smart Space Applications
Palviainen, Marko; Kuusijärvi, Jarkko; Ovaska, Eila
2012-01-01
Cross-smart space applications are specific types of software services that enable users to share information, and to monitor and control the physical and logical surroundings in a way that is meaningful for the user's situation. For developing cross-smart space applications, this paper makes two main contributions: it introduces (i) a component design and scripting method for end-user programming of cross-smart space applications and (ii) a backend framework of components that interwork to support the brunt of the RDFScript translation, and the use and execution of ontology models. Before end-user programming activities, software professionals must develop easy-to-apply Driver components for the APIs of existing software systems. Thereafter, end-users are able to create applications from the commands of the Driver components with the help of the provided toolset. The paper also introduces the reference implementation of the framework, tools for Driver component development and end-user programming of cross-smart space applications, and the first evaluation results on their application.
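The division of labor the abstract describes can be suggested with a sketch: a professional wraps an existing API once as a Driver, and end-users then compose its commands into scripts. All names are illustrative, not the framework's actual interfaces.

```python
# Sketch of the Driver-component idea (illustrative names only).
class LampDriver:
    """Driver exposing an existing smart-space lamp API as simple commands."""
    def __init__(self):
        self.on = False
    def commands(self) -> dict:
        return {"toggle": self.toggle, "status": self.status}
    def toggle(self):
        self.on = not self.on
        return self.on
    def status(self) -> str:
        return "on" if self.on else "off"

def run_script(driver, script):
    """End-user 'program': a list of command names executed in order."""
    return [driver.commands()[cmd]() for cmd in script]

print(run_script(LampDriver(), ["status", "toggle", "status"]))
```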
Estimating Software Effort Hours for Major Defense Acquisition Programs
ERIC Educational Resources Information Center
Wallshein, Corinne C.
2010-01-01
Software Cost Estimation (SCE) uses labor hours or effort required to conceptualize, develop, integrate, test, field, or maintain program components. Department of Defense (DoD) SCE can use initial software data parameters to project effort hours for large, software-intensive programs for contractors reporting the top levels of process maturity,…
Knowledge base methodology: Methodology for first Engineering Script Language (ESL) knowledge base
NASA Technical Reports Server (NTRS)
Peeris, Kumar; Izygon, Michel E.
1992-01-01
The primary goal of reusing software components is that software can be developed faster, cheaper, and with higher quality. However, reuse is not automatic and cannot just happen; it has to be carefully engineered. For example, a component needs to be easily understandable in order to be reused, and it also has to be malleable enough to fit into different applications. In fact, the software development process is deeply affected when reuse is being applied. During component development, a serious effort has to be directed toward making these components as reusable as possible. This implies defining reuse coding style guidelines and applying them to any new component being created as well as to any old component being modified. These guidelines should point out the favorable reuse features and may apply to naming conventions, module size and cohesion, internal documentation, etc. During application development, effort is shifted from writing new code toward finding, and eventually modifying, existing pieces of code, then assembling them together. We see here that reuse is not free, and therefore has to be carefully managed.
NASA Technical Reports Server (NTRS)
Basili, V. R.; Zelkowitz, M. V.
1978-01-01
In a brief evaluation of software-related considerations, it is found that suitable approaches for software development depend to a large degree on the characteristics of the particular project involved. An analysis is conducted of development problems in an environment in which ground support software is produced for spacecraft control. The amount of work involved is in the range from 6 to 10 man-years. Attention is given to a general project summary, a programmer/analyst survey, a component summary, a component status report, a resource summary, a change report, a computer program run analysis, aspects of data collection on a smaller scale, progress forecasting, problems of overhead, and error analysis.
General-Purpose Front End for Real-Time Data Processing
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
FRONTIER is a computer program that functions as a front end for any of a variety of other software of both the artificial intelligence (AI) and conventional data-processing types. As used here, "front end" signifies interface software needed for acquiring and preprocessing data and making the data available for analysis by the other software. FRONTIER is reusable in that it can be rapidly tailored to any such other software with minimum effort. Each component of FRONTIER is programmable and is executed in an embedded virtual machine. Each component can be reconfigured during execution. The virtual-machine implementation makes FRONTIER independent of the type of computing hardware on which it is executed.
NASA Astrophysics Data System (ADS)
Hart, D. M.; Merchant, B. J.; Abbott, R. E.
2012-12-01
The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data was collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude, and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.
A federated design for a neurobiological simulation engine: the CBI federated software architecture.
Cornelis, Hugo; Coop, Allan D; Bower, James M
2012-01-01
Simulator interoperability and extensibility has become a growing requirement in computational biology. To address this, we have developed a federated software architecture. It is federated by its union of independent disparate systems under a single cohesive view, provides interoperability through its capability to communicate, execute programs, or transfer data among different independent applications, and supports extensibility by enabling simulator expansion or enhancement without the need for major changes to system infrastructure. Historically, simulator interoperability has relied on development of declarative markup languages such as the neuron modeling language NeuroML, while simulator extension typically occurred through modification of existing functionality. The software architecture we describe here allows for both these approaches. However, it is designed to support alternative paradigms of interoperability and extensibility through the provision of logical relationships and defined application programming interfaces. They allow any appropriately configured component or software application to be incorporated into a simulator. The architecture defines independent functional modules that run stand-alone. They are arranged in logical layers that naturally correspond to the occurrence of high-level data (biological concepts) versus low-level data (numerical values) and distinguish data from control functions. The modular nature of the architecture and its independence from a given technology facilitates communication about similar concepts and functions for both users and developers. It provides several advantages for multiple independent contributions to software development. Importantly, these include: (1) Reduction in complexity of individual simulator components when compared to the complexity of a complete simulator, (2) Documentation of individual components in terms of their inputs and outputs, (3) Easy removal or replacement of unnecessary or obsoleted components, (4) Stand-alone testing of components, and (5) Clear delineation of the development scope of new components.
Use of NMR and NMR Prediction Software to Identify Components in Red Bull Energy Drinks
ERIC Educational Resources Information Center
Simpson, Andre J.; Shirzadi, Azadeh; Burrow, Timothy E.; Dicks, Andrew P.; Lefebvre, Brent; Corrin, Tricia
2009-01-01
A laboratory experiment designed as part of an upper-level undergraduate analytical chemistry course is described. Students investigate two popular soft drinks (Red Bull Energy Drink and sugar-free Red Bull Energy Drink) by NMR spectroscopy. With assistance of modern NMR prediction software they identify and quantify major components in each…
Software For Graphical Representation Of A Network
NASA Technical Reports Server (NTRS)
Mcallister, R. William; Mclellan, James P.
1993-01-01
System Visualization Tool (SVT) computer program developed to provide systems engineers with means of graphically representing networks. Generates diagrams illustrating structures and states of networks defined by users. Provides systems engineers a powerful tool simplifying analysis of requirements and testing and maintenance of complex software-controlled systems. Employs visual models supporting analysis of chronological sequences of requirements, simulation data, and related software functions. Applied to pneumatic, hydraulic, and propellant-distribution networks. Used to define and view arbitrary configurations of such major hardware components of system as propellant tanks, valves, propellant lines, and engines. Also graphically displays status of each component. Advantage of SVT: utilizes visual cues to represent configuration of each component within network. Written in Turbo Pascal(R), version 5.0.
EMMA: a new paradigm in configurable software
Nogiec, J. M.; Trombly-Freytag, K.
2017-11-23
EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. As a result, it provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.
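The loosely coupled, event-driven style EMMA is built on can be suggested with a minimal publish/subscribe sketch. This is illustrative only and is not EMMA's actual API:

```python
# Minimal event-bus sketch of a loosely coupled, event-driven design.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subs = defaultdict(list)
    def subscribe(self, topic: str, handler) -> None:
        self._subs[topic].append(handler)
    def publish(self, topic: str, payload) -> None:
        for handler in self._subs[topic]:   # components never call each other directly
            handler(payload)

bus = EventBus()
bus.subscribe("measurement", lambda m: print("logger got", m))
bus.subscribe("measurement", lambda m: print("plotter got", m))
bus.publish("measurement", {"probe": 1, "field_T": 0.42})
```

Because publishers and subscribers share only topic names, individual components can be added, removed, or recomposed without touching the others, which is the composability the abstract emphasizes.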
Software for integrated manufacturing systems, part 2
NASA Technical Reports Server (NTRS)
Volz, R. A.; Naylor, A. W.
1987-01-01
Part 1 presented an overview of the unified approach to manufacturing software. Here, the specific characteristics of the approach that allow it to realize the goals of reduced cost, increased reliability, and increased flexibility are considered. The paper examines why the blending of a components view, distributed languages, generics, and formal models is important; why each individual part of this approach is essential; and why each component will typically have each of these parts. An example of a specification for a real material handling system is presented using the approach and compared with the standard interface specification given by the manufacturer. Use of the component in a distributed manufacturing system is then compared with use of the traditional specification under a more traditional approach to designing the system. An overview is also provided of the underlying mechanisms used for implementing distributed manufacturing systems using the unified software/hardware component approach.
Certified Binaries for Software Components
2007-09-01
The Architecture Based Design Method
2000-01-01
implementation of components of different types. The software templates include a description of how components interact with shared services and also include citizenship responsibilities for components.
NASGRO(registered trademark): Fracture Mechanics and Fatigue Crack Growth Analysis Software
NASA Technical Reports Server (NTRS)
Forman, Royce; Shivakumar, V.; Mettu, Sambi; Beek, Joachim; Williams, Leonard; Yeh, Feng; McClung, Craig; Cardinal, Joe
2004-01-01
This viewgraph presentation describes NASGRO, which is a fracture mechanics and fatigue crack growth analysis software package that is used to reduce risk of fracture in Space Shuttles. The contents include: 1) Consequences of Fracture; 2) NASA Fracture Control Requirements; 3) NASGRO Reduces Risk; 4) NASGRO Use Inside NASA; 5) NASGRO Components: Crack Growth Module; 6) NASGRO Components: Material Property Module; 7) Typical NASGRO analysis: Crack growth or component life calculation; and 8) NASGRO Sample Application: Orbiter feedline flowliner crack analysis.
Software components for medical image visualization and surgical planning
NASA Astrophysics Data System (ADS)
Starreveld, Yves P.; Gobbi, David G.; Finnis, Kirk; Peters, Terence M.
2001-05-01
Purpose: The development of new applications in medical image visualization and surgical planning requires the completion of many common tasks such as image reading and re-sampling, segmentation, volume rendering, and surface display. Intra-operative use requires an interface to a tracking system and image registration, and the application requires basic, easy to understand user interface components. Rapid changes in computer and end-application hardware, as well as in operating systems and network environments, make it desirable to have a hardware- and operating-system-independent collection of reusable software components that can be assembled rapidly to prototype new applications. Methods: Using the OpenGL based Visualization Toolkit as a base, we have developed a set of components that implement the above mentioned tasks. The components are written in both C++ and Python, but all are accessible from Python, a byte compiled scripting language. The components have been used on the Red Hat Linux, Silicon Graphics Iris, Microsoft Windows, and Apple OS X platforms. Rigorous object-oriented software design methods have been applied to ensure hardware independence and a standard application programming interface (API). There are components to acquire, display, and register images from MRI, MRA, CT, Computed Rotational Angiography (CRA), Digital Subtraction Angiography (DSA), 2D and 3D ultrasound, video and physiological recordings. Interfaces to various tracking systems for intra-operative use have also been implemented. Results: The described components have been implemented and tested. To date they have been used to create image manipulation and viewing tools, a deep brain functional atlas, a 3D ultrasound acquisition and display platform, a prototype minimally invasive robotic coronary artery bypass graft planning system, a tracked neuro-endoscope guidance system and a frame-based stereotaxy neurosurgery planning tool. The frame-based stereotaxy module has been licensed and certified for use in a commercial image guidance system. Conclusions: It is feasible to encapsulate image manipulation and surgical guidance tasks in individual, reusable software modules. These modules allow for faster development of new applications. The strict application of object oriented software design methods allows individual components of such a system to make the transition from the research environment to a commercial one.
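Since the components are built on the Visualization Toolkit and scripted from Python, a flavor of the approach can be given with a minimal pipeline using VTK's present-day Python bindings. This is generic VTK usage for illustration, not the authors' actual component set:

```python
# Minimal VTK pipeline in Python: source -> mapper -> actor -> renderer.
# Generic VTK usage for illustration; not the authors' component library.
import vtk

sphere = vtk.vtkSphereSource()            # stand-in for a loaded medical dataset
mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(sphere.GetOutputPort())

actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)

window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)

interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)

window.Render()
interactor.Start()                        # interactive window; blocks until closed
```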
The Evolution of Software Publication in Astronomy
NASA Astrophysics Data System (ADS)
Cantiello, Matteo
2018-01-01
Software is a fundamental component of the scientific research process. As astronomical discoveries increasingly rely on complex numerical calculations and the analysis of big data sets, publishing and documenting software is a fundamental step in ensuring transparency and reproducibility of results. I will briefly discuss the recent history of software publication and highlight the challenges and opportunities ahead.
The Chandra X-ray Center data system: supporting the mission of the Chandra X-ray Observatory
NASA Astrophysics Data System (ADS)
Evans, Janet D.; Cresitello-Dittmar, Mark; Doe, Stephen; Evans, Ian; Fabbiano, Giuseppina; Germain, Gregg; Glotfelty, Kenny; Hall, Diane; Plummer, David; Zografou, Panagoula
2006-06-01
The Chandra X-ray Center Data System provides end-to-end scientific software support for Chandra X-ray Observatory mission operations. The data system includes the following components: (1) observers' science proposal planning tools; (2) science mission planning tools; (3) science data processing, monitoring, and trending pipelines and tools; and (4) data archive and database management. A subset of the science data processing component is ported to multiple platforms and distributed to end-users as a portable data analysis package. Web-based user tools are also available for data archive search and retrieval. We describe the overall architecture of the data system and its component pieces, and consider the design choices and their impacts on maintainability. We discuss the many challenges involved in maintaining a large, mission-critical software system with limited resources. These challenges include managing continually changing software requirements and ensuring the integrity of the data system and resulting data products while being highly responsive to the needs of the project. We describe our use of COTS and OTS software at the subsystem and component levels, our methods for managing multiple release builds, and our approach to adapting a large code base to new hardware and software platforms. We review our experiences during the life of the mission so far, and our approaches for keeping a small, but highly talented, development team engaged during the maintenance phase of a mission.
Component-Based Visualization System
NASA Technical Reports Server (NTRS)
Delgado, Francisco
2005-01-01
A software system has been developed that gives engineers and operations personnel with no "formal" programming expertise, but who are familiar with the Microsoft Windows operating system, the ability to create visualization displays to monitor the health and performance of aircraft/spacecraft. This software system is currently supporting the X38 V201 spacecraft component/system testing and is intended to give users the ability to create, test, deploy, and certify their subsystem displays in a fraction of the time that it would take to do so using previous software and programming methods. Within the visualization system there are three major components: the developer, the deployer, and the widget set. The developer is a blank canvas with widget menu items that give users the ability to easily create displays. The deployer is an application that allows for the deployment of the displays created using the developer application. The deployer has additional functionality that the developer does not have, such as printing of displays, screen captures to files, and windowing of displays, and it also serves as the interface into the documentation archive and help system. The third major component is the widget set. The widgets are the visual representation of the items that will make up the display (e.g., meters, dials, buttons, numerical indicators, and string indicators). This software was developed using Visual C++ and uses COTS (commercial off-the-shelf) software where possible.
The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)
2003-01-01
We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.
Code of Federal Regulations, 2011 CFR
2011-10-01
... electrical, mechanical, hardware, or software) that is part of a system or subsystem. Configuration..., including the hardware components and software version, is documented and maintained through the life-cycle... or compensates individuals to perform the duties specified in § 236.921 (a). Executive software means...
Code of Federal Regulations, 2014 CFR
2014-10-01
... electrical, mechanical, hardware, or software) that is part of a system or subsystem. Configuration..., including the hardware components and software version, is documented and maintained through the life-cycle... or compensates individuals to perform the duties specified in § 236.921 (a). Executive software means...
Code of Federal Regulations, 2012 CFR
2012-10-01
... electrical, mechanical, hardware, or software) that is part of a system or subsystem. Configuration..., including the hardware components and software version, is documented and maintained through the life-cycle... or compensates individuals to perform the duties specified in § 236.921 (a). Executive software means...
Code of Federal Regulations, 2013 CFR
2013-10-01
... electrical, mechanical, hardware, or software) that is part of a system or subsystem. Configuration..., including the hardware components and software version, is documented and maintained through the life-cycle... or compensates individuals to perform the duties specified in § 236.921 (a). Executive software means...
CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 1
2008-01-01
project management and the individual components of the software life-cycle model; it will be awarded for...software professionals that had been formally educated in software project management. The study indicated that our industry is lacking in program managers...software developments get bigger, more complicated, and more dependent on senior software professionals to get the project on the right path
Issues in Defining Software Architectures in a GIS Environment
NASA Technical Reports Server (NTRS)
Acosta, Jesus; Alvorado, Lori
1997-01-01
The primary mission of the Pan-American Center for Earth and Environmental Studies (PACES) is to advance the research areas that are relevant to NASA's Mission to Planet Earth program. One of the activities at PACES is the establishment of a repository for geographical, geological and environmental information that covers various regions of Mexico and the southwest region of the U.S. and that is acquired from NASA and other sources through remote sensing, ground studies or paper-based maps. The center will be providing access to this information for other government entities in the U.S. and Mexico, and for research groups from universities, national laboratories and industry. Geographical Information Systems (GIS) provide the means to manage, manipulate, analyze and display the geographically referenced information that will be managed by PACES. Excellent off-the-shelf software exists for a complete GIS, as well as software for storing and managing spatial databases, processing images, networking and viewing maps with layered information. This allows the user flexibility in combining systems to create a GIS or to mix these software packages with custom-built application programs. Software architectural languages provide the ability to specify the computational components and the interactions among these components, an important topic in the domain of GIS because of the need to integrate numerous software packages. This paper discusses the characteristics that architectural languages address with respect to the issues relating to the data that must be communicated between software systems and components when systems interact. The paper presents a background on GIS in section 2. Section 3 gives an overview of software architecture and architectural languages. Section 4 suggests issues that may be of concern when defining the software architecture of a GIS. The last section discusses the future research effort and finishes with a summary.
Butterfly valve in a virtual environment
NASA Astrophysics Data System (ADS)
Talekar, Aniruddha; Patil, Saurabh; Thakre, Prashant; Rajkumar, E.
2017-11-01
Assembly of components is one of the processes involved in product design and development. The present paper deals with the assembly of the components of a simple butterfly valve in a virtual environment. The assembly has been carried out using virtual reality software by trial-and-error methods. The parts are modelled using parametric software (SolidWorks), meshed accordingly, and then imported into the virtual environment for assembly.
Computational Simulations and the Scientific Method
NASA Technical Reports Server (NTRS)
Kleb, Bil; Wood, Bill
2005-01-01
As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.
Architecture of a framework for providing information services for public transport.
García, Carmelo R; Pérez, Ricardo; Lorenzo, Alvaro; Quesada-Arencibia, Alexis; Alayón, Francisco; Padrón, Gabino
2012-01-01
This paper presents OnRoute, a framework for developing and running ubiquitous software that provides information services to passengers of public transportation, including payment systems and on-route guidance services. To achieve a high level of interoperability, accessibility and context awareness, OnRoute uses the ubiquitous computing paradigm. To guarantee the quality of the software produced, the reliable software principles used in critical contexts, such as automotive systems, are also considered by the framework. The main components of its architecture (run-time, system services, software components and development discipline) and how they are deployed in the transportation network (stations and vehicles) are described in this paper. Finally, to illustrate the use of OnRoute, the development of a guidance service for travellers is explained.
What Not To Do: Anti-patterns for Developing Scientific Workflow Software Components
NASA Astrophysics Data System (ADS)
Futrelle, J.; Maffei, A. R.; Sosik, H. M.; Gallager, S. M.; York, A.
2013-12-01
Scientific workflows promise to enable efficient scaling-up of researcher code to handle large datasets and workloads, as well as documentation of scientific processing via standardized provenance records, etc. Workflow systems and related frameworks for coordinating the execution of otherwise separate components are limited, however, in their ability to overcome software engineering design problems commonly encountered in pre-existing components, such as scripts developed externally by scientists in their laboratories. In practice, this often means that components must be rewritten or replaced in a time-consuming, expensive process. In the course of an extensive workflow development project involving large-scale oceanographic image processing, we have begun to identify and codify 'anti-patterns'--problematic design characteristics of software--that make components fit poorly into complex automated workflows. We have gone on to develop and document low-effort solutions and best practices that efficiently address the anti-patterns we have identified. The issues, solutions, and best practices can be used to evaluate and improve existing code, as well as to guide the development of new components. For example, we have identified a common anti-pattern we call 'batch-itis', in which a script fails and then cannot perform more work, even if that work is not precluded by the failure. The solution we have identified--removing unnecessary looping over independent units of work--is often easier to code than the anti-pattern, as it eliminates the need for complex control flow logic in the component. Other anti-patterns we have identified are similarly easy to spot and often easy to fix. We have drawn upon experience working with three science teams at Woods Hole Oceanographic Institution, each of which has designed novel imaging instruments and associated image analysis code. By developing use cases and prototypes within these teams, we have undertaken formal evaluations of software components developed by programmers with widely varying levels of expertise, and have been able to discover and characterize a number of anti-patterns. Our evaluation methodology and testbed have also enabled us to assess the efficacy of strategies to address these anti-patterns according to scientifically relevant metrics, such as the ability of algorithms to perform faster than the rate of data acquisition and the accuracy of workflow component output relative to ground truth. The set of anti-patterns and solutions we have identified augments the body of better-known software engineering anti-patterns by addressing additional concerns that arise when a software component has to function as part of a workflow assembled out of independently developed codebases. Our experience shows that identifying and resolving these anti-patterns reduces development time and improves performance without reducing component reusability.
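The 'batch-itis' anti-pattern and its remedy are easy to show in miniature. The sketch below is ours, with a hypothetical process() helper standing in for a scientist's per-unit analysis routine:

```python
# A sketch of 'batch-itis' and its fix; process() and the file names
# are hypothetical stand-ins for real analysis code and data.

def process(path):
    """Stand-in for a scientist's per-unit analysis routine."""
    if path.endswith(".bad"):
        raise ValueError(f"cannot parse {path}")
    return f"result for {path}"

# Anti-pattern ('batch-itis'): the component owns the loop, so one bad
# input aborts every remaining, independent unit of work.
def run_batch(paths):
    results = []
    for path in paths:
        results.append(process(path))   # a single failure kills the batch
    return results

# Remedy: the component handles exactly one independent unit of work;
# looping, retries, and failure isolation move to the workflow engine,
# which needs no complex control-flow logic inside the component.
def run_one(path):
    return process(path)
```

With run_one(), a failure on one image costs only that image; the workflow system can record the failure in provenance and continue with the rest.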
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, R. H.; Badger, W.; Beckman, C. S.; Beshers, G.; Hammerslag, D.; Kimball, J.; Kirslis, P. A.; Render, H.; Richards, P.; Terwilliger, R.
1984-01-01
The project to automate the management of software production systems is described. The SAGA system is a software environment designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. Several major components of the SAGA system have been completed in prototype form. The construction methods are described.
1986-09-01
point here is that the capital cost of design and development (including the cost of software tools and/or CAD/CAM programs which aided in the development...and capitalization, software is in many ways more like a hardware component than it is like the technical documentation which supports the hardware...invoked, the owner of intellectual property rights in software may attach appropriate copyright notices to software delivered under this contract. 2.2.2
45 CFR 307.5 - Mandatory computerized support enforcement systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
... hardware, operational system software, and electronic linkages with the separate components of an... plans to use and how they will interface with the base system; (3) Provide documentation that the... and for operating costs including hardware, operational software and applications software of a...
45 CFR 307.5 - Mandatory computerized support enforcement systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
... hardware, operational system software, and electronic linkages with the separate components of an... plans to use and how they will interface with the base system; (3) Provide documentation that the... and for operating costs including hardware, operational software and applications software of a...
45 CFR 307.5 - Mandatory computerized support enforcement systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
... hardware, operational system software, and electronic linkages with the separate components of an... plans to use and how they will interface with the base system; (3) Provide documentation that the... and for operating costs including hardware, operational software and applications software of a...
Bar-Code System for a Microbiological Laboratory
NASA Technical Reports Server (NTRS)
Law, Jennifer; Kirschner, Larry
2007-01-01
A bar-code system has been assembled for a microbiological laboratory that must examine a large number of samples. The system includes a commercial bar-code reader, computer hardware and software components, plus custom-designed database software. The software generates a user-friendly, menu-driven interface.
NASA Technical Reports Server (NTRS)
Wallace, Robert
1986-01-01
A major impediment to a systematic attack on Ada software reusability is the lack of an effective taxonomy for software component functions. The scope of all possible applications of Ada software is considered too great to allow the practical development of a working taxonomy. Instead, for the purposes herein, the scope of Ada software application is limited to device and subsystem control in real-time embedded systems. A functional approach is taken in constructing the taxonomy tree for the identified Ada domain. The use of modular software functions as a starting point fits well with the object-oriented programming philosophy of Ada. Examples of the types of functions represented within the working taxonomy are real-time kernels, interrupt service routines, synchronization and message passing, data conversion, digital filtering and signal conditioning, and device control. The constructed taxonomy is proposed as a framework from which a needs analysis can be performed to reveal voids in current Ada real-time embedded programming efforts for the Space Station.
Large scale database scrubbing using object oriented software components.
Herting, R L; Barnes, M R
1998-01-01
Now that case managers, quality improvement teams, and researchers use medical databases extensively, the ability to share and disseminate such databases while maintaining patient confidentiality is paramount. A process called scrubbing addresses this problem by removing personally identifying information while keeping the integrity of the medical information intact. Scrubbing entire databases, containing multiple tables, requires that the implicit relationships between data elements in different tables of the database be maintained. To address this issue we developed DBScrub, a Java program that interfaces with any JDBC compliant database and scrubs the database while maintaining the implicit relationships within it. DBScrub uses a small number of highly configurable object-oriented software components to carry out the scrubbing. We describe the structure of these software components and how they maintain the implicit relationships within the database.
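The key idea, preserving implicit cross-table relationships while scrubbing, can be sketched briefly: map each original identifier to a stable surrogate so that joins still work after scrubbing. DBScrub itself is a Java/JDBC tool, so the Python/sqlite3 sketch below only illustrates the general technique; the table and column names are hypothetical.

```python
# A minimal sketch of relationship-preserving scrubbing, assuming a toy
# two-table schema. The same original ID always maps to the same
# surrogate, so joins between tables remain intact after scrubbing.
import itertools
import sqlite3

counter = itertools.count(1)
surrogates = {}  # original patient ID -> stable surrogate

def scrub_id(original):
    if original not in surrogates:
        surrogates[original] = f"PT{next(counter):06d}"
    return surrogates[original]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patients (patient_id TEXT, name TEXT);
    CREATE TABLE visits   (patient_id TEXT, visit_date TEXT);
    INSERT INTO patients VALUES ('123-45-6789', 'Jane Doe');
    INSERT INTO visits   VALUES ('123-45-6789', '1998-03-14');
""")

for table in ("patients", "visits"):
    for rowid, pid in conn.execute(f"SELECT rowid, patient_id FROM {table}").fetchall():
        conn.execute(f"UPDATE {table} SET patient_id = ? WHERE rowid = ?",
                     (scrub_id(pid), rowid))
conn.execute("UPDATE patients SET name = 'REDACTED'")  # drop identifying fields

# The join still links the scrubbed records correctly.
print(conn.execute("""SELECT p.patient_id, v.visit_date
                      FROM patients p JOIN visits v USING (patient_id)""").fetchall())
```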
NASA Astrophysics Data System (ADS)
Tamura, Yoshinobu; Yamada, Shigeru
OSS (open source software) systems, which serve as key components of critical infrastructure in our social life, are still expanding. In particular, embedded OSS systems have been gaining a lot of attention in the embedded system area, e.g., Android, BusyBox, TRON, etc. However, poor handling of quality problems and customer support hinders the progress of embedded OSS. Also, it is difficult for developers to assess the reliability and portability of embedded OSS on a single-board computer. In this paper, we propose a method of software reliability assessment based on flexible hazard rates for embedded OSS. We analyze actual data of software failure-occurrence time-intervals to show numerical examples of software reliability assessment for embedded OSS. Moreover, we compare the proposed hazard rate model with typical conventional hazard rate models by using goodness-of-fit criteria. Furthermore, we discuss the optimal software release problem for the porting phase based on the total expected software maintenance cost.
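As a point of reference for this kind of assessment, the sketch below fits the simplest possible baseline, a constant (exponential) hazard rate, to failure time-intervals by maximum likelihood. The paper's flexible hazard rate model is more elaborate than this, and the interval values here are made-up illustrative numbers.

```python
# A minimal sketch assuming a constant-hazard (exponential) model:
# lambda_hat = (number of failures) / (total observed time), and
# R(t) = exp(-lambda * t) is the probability of surviving t hours.
import math

intervals = [12.0, 7.5, 30.2, 18.9, 44.1]  # hours between failures (illustrative)

lam = len(intervals) / sum(intervals)  # maximum-likelihood hazard-rate estimate

def reliability(t, lam=lam):
    """Probability of operating t hours without a failure under the model."""
    return math.exp(-lam * t)

print(f"hazard rate ~ {lam:.4f}/hour, R(24h) ~ {reliability(24.0):.3f}")
```

A flexible hazard rate model, as proposed in the paper, would let the rate vary over time instead of holding it constant, which is what the goodness-of-fit comparison evaluates.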
A Prototype for the Support of Integrated Software Process Development and Improvement
NASA Astrophysics Data System (ADS)
Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian
An efficient software development process is one of the key success factors for quality software. Efficiency results not only from the appropriate establishment of integrated project management and of the software development process, but also from their continuous improvement. This paper hence proposes a software process maintenance framework consisting of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.
Parameterized hardware description as object oriented hardware model implementation
NASA Astrophysics Data System (ADS)
Drabik, Pawel K.
2010-09-01
The paper introduces a novel model for the design, visualization and management of complex, highly adaptive hardware systems. The model establishes a component-oriented environment for both hardware modules and software applications, and builds on parameterized hardware description research. The establishment of a stable link between hardware and software, the purpose of the work designed and realized here, is presented. A novel programming framework for the environment, named Graphic-Functional-Components, is presented. The purpose of the paper is to present object-oriented hardware modeling with the mentioned features. A possible model implementation in FPGA chips and its management by object-oriented Java software are described.
NASA Technical Reports Server (NTRS)
Wilber, George F.
2017-01-01
This Software Description Document (SDD) captures the design for developing the Flight Interval Management (FIM) system Configurable Graphics Display (CGD) software. Specifically this SDD describes aspects of the Boeing CGD software and the surrounding context and interfaces. It does not describe the Honeywell components of the CGD system. The SDD provides the system overview, architectural design, and detailed design with all the necessary information to implement the Boeing components of the CGD software and integrate them into the CGD subsystem within the larger FIM system. Overall system and CGD system-level requirements are derived from the CGD SRS (in turn derived from the Boeing System Requirements Design Document (SRDD)). Display and look-and-feel requirements are derived from Human Machine Interface (HMI) design documents and working group recommendations. This Boeing CGD SDD is required to support the upcoming Critical Design Review (CDR).
Design of Control Software for a High-Speed Coherent Doppler Lidar System for CO2 Measurement
NASA Technical Reports Server (NTRS)
Vanvalkenburg, Randal L.; Beyon, Jeffrey Y.; Koch, Grady J.; Yu, Jirong; Singh, Upendra N.; Kavaya, Michael J.
2010-01-01
The design of the software for a 2-micron coherent high-speed Doppler lidar system for CO2 measurement at NASA Langley Research Center is discussed in this paper. The specific strategy and design topology to meet the requirements of the system are reviewed. In order to attain high-speed digitization of the different types of signals to be sampled on multiple channels, a carefully planned design of the control software is imperative. Samples of digitized data from each channel and their roles in data analysis post-processing are also presented. Several challenges of extremely fast, high-volume data acquisition are discussed. The software must check the validity of each lidar return as well as other monitoring channel data in real time. For such high-speed data acquisition systems, the software is a key component that enables the entire scope of CO2 measurement studies using commercially available system components.
A theoretical basis for the analysis of multiversion software subject to coincident errors
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.; Lee, L. D.
1985-01-01
Fundamental to the development of redundant software techniques (known as fault-tolerant software) is an understanding of the impact of multiple joint occurrences of errors, referred to here as coincident errors. A theoretical basis for the study of redundant software is developed which: (1) provides a probabilistic framework for empirically evaluating the effectiveness of a general multiversion strategy when component versions are subject to coincident errors, and (2) permits an analytical study of the effects of these errors. An intensity function, called the intensity of coincident errors, has a central role in this analysis. This function describes the propensity of programmers to introduce design faults in such a way that software components fail together when executing in the application environment. A condition under which a multiversion system is a better strategy than relying on a single version is given.
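The role of the intensity function can be made concrete with a compact sketch (notation ours, not necessarily the paper's): let theta(x) be the probability that a randomly chosen version fails on input x, with versions failing independently conditional on the random input X.

```latex
% Probability that all n independently developed versions fail together:
\Pr[\text{all } n \text{ versions fail}]
  = \mathbb{E}\!\left[\theta(X)^{n}\right]
  \;\ge\; \bigl(\mathbb{E}[\theta(X)]\bigr)^{n}
  \quad \text{(Jensen's inequality).}
```

Equality holds only when theta is constant across inputs; any input-dependent variation, i.e., coincident errors concentrated on hard inputs, makes an n-version system less reliable than a naive independence assumption would predict.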
Feature-based component model for design of embedded systems
NASA Astrophysics Data System (ADS)
Zha, Xuan Fang; Sriram, Ram D.
2004-11-01
An embedded system is a hybrid of hardware and software, combining the flexibility of software with the real-time performance of hardware. Embedded systems can be considered as assemblies of hardware and software components. An Open Embedded System Model (OESM) is currently being developed at NIST to provide a standard representation and exchange protocol for embedded systems and for system-level design, simulation, and testing information. This paper proposes an approach to representing an embedded system feature-based model in OESM, i.e., an Open Embedded System Feature Model (OESFM), addressing models of embedded system artifacts, embedded system components, embedded system features, and embedded system configuration/assembly. The approach provides an object-oriented UML (Unified Modeling Language) representation for the embedded system feature model and defines an extension to the NIST Core Product Model. The model provides a feature-based component framework allowing the designer to develop a virtual embedded system prototype by assembling virtual components. The framework not only provides a formal, precise model of the embedded system prototype but also offers the possibility of designing variations of prototypes whose members are derived by changing certain virtual components with different features. A case study example is discussed to illustrate the embedded system model.
NASA Technical Reports Server (NTRS)
Jovic, Srboljub
2015-01-01
This document provides the software design description for the two core software components, the LVC Gateway and the LVC Gateway Toolbox, and for two participants, the LVC Gateway Data Logger and the SAA Processor (SaaProc).
ERIC Educational Resources Information Center
Wade, Erin; Boon, Richard T.; Spencer, Vicky G.
2010-01-01
The aim of this research brief was to explore the efficacy of story mapping, with the integration of Kidspiration[C] software, to enhance the reading comprehension skills of story grammar components for elementary-age students. Three students served as the participants, two in third grade and one in fourth, with specific learning disabilities…
[Computer-assisted management of depots for blood products in health establishments].
Carré, J
2008-11-01
To manage the depot of blood components at the hospital of the city of Bayeux, the laboratory uses Cursus, a dedicated haemovigilance software package. The benefits of using this software at different steps of blood bank management are: simplification, security and harmonization of practices during receipt and issuance of blood components; secure recording through the use of bar codes for patient identification and blood component listing; implementation of a computerized tracking system for transfusion and traceability; limitation of written documents; and availability of statistics on the management of the depot.
Austin, Peter C
2010-04-22
Multilevel logistic regression models are increasingly being used to analyze clustered data in medical, public health, epidemiological, and educational research. Procedures for estimating the parameters of such models are available in many statistical software packages. There is currently little evidence on the minimum number of clusters necessary to reliably fit multilevel regression models. We conducted a Monte Carlo study to compare the performance of different statistical software procedures for estimating multilevel logistic regression models when the number of clusters was low. We examined procedures available in BUGS, HLM, R, SAS, and Stata. We found that there were qualitative differences in the performance of different software procedures for estimating multilevel logistic models when the number of clusters was low. Among the likelihood-based procedures, estimation methods based on adaptive Gauss-Hermite approximations to the likelihood (glmer in R and xtlogit in Stata) or adaptive Gaussian quadrature (Proc NLMIXED in SAS) tended to have superior performance for estimating variance components when the number of clusters was small, compared to software procedures based on penalized quasi-likelihood. However, only Bayesian estimation with BUGS allowed for accurate estimation of variance components when there were fewer than 10 clusters. For all statistical software procedures, estimation of variance components tended to be poor when there were only five subjects per cluster, regardless of the number of clusters.
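The data-generating step of such a Monte Carlo study is easy to sketch: clustered binary outcomes from a random-intercept logistic model. The parameter values below are illustrative assumptions, not those used in the study.

```python
# A minimal sketch of simulating clustered binary data from a
# random-intercept logistic model, the setting this study examines.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_clusters=5, n_per_cluster=5, beta0=-1.0, beta1=0.5, sigma_u=1.0):
    """Return (x, y, cluster) from a random-intercept logistic model."""
    u = rng.normal(0.0, sigma_u, size=n_clusters)      # cluster random effects
    cluster = np.repeat(np.arange(n_clusters), n_per_cluster)
    x = rng.normal(size=cluster.size)                  # subject-level covariate
    eta = beta0 + beta1 * x + u[cluster]               # linear predictor
    y = rng.random(cluster.size) < 1.0 / (1.0 + np.exp(-eta))
    return x, y.astype(int), cluster

x, y, cluster = simulate()
# Each simulated data set would then be fit with glmer, xtlogit,
# PROC NLMIXED, or BUGS, and the variance-component estimates compared
# against the known sigma_u, as the study describes.
```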
Reuse Metrics for Object Oriented Software
NASA Technical Reports Server (NTRS)
Bieman, James M.
1998-01-01
One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA-supported project we (1) derived a suite of metrics which quantify reuse attributes for object-oriented, object-based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object-oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
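The simplest member of such a metric suite is a plain reuse-level ratio; the sketch below illustrates only that baseline notion (the function and component names are ours), whereas the project's metrics also account for coupling, cohesion, and inheritance.

```python
# A minimal sketch of one simple reuse measurement: the fraction of a
# system's components taken from outside the system (verbatim reuse).
# This is an illustrative baseline, not the project's full metric suite.
def reuse_level(all_components, reused_components):
    """Fraction of components reused from elsewhere rather than newly written."""
    total = set(all_components)
    if not total:
        return 0.0
    return len(set(reused_components) & total) / len(total)

system = {"parser", "solver", "logger", "ui"}
reused = {"logger", "ui"}  # e.g., taken from an in-house library
print(f"reuse level: {reuse_level(system, reused):.2f}")  # 0.50
```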
NASA Tech Briefs, December 1997. Volume 21, No. 12
NASA Technical Reports Server (NTRS)
1997-01-01
Topics: Design and Analysis Software; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.
The Software Design Document: More than a User's Manual.
ERIC Educational Resources Information Center
Bowers, Dennis
1989-01-01
Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…
Multidisciplinary and Active/Collaborative Approaches in Teaching Requirements Engineering
ERIC Educational Resources Information Center
Rosca, Daniela
2005-01-01
The requirements engineering course is a core component of the curriculum for the Master's in Software Engineering programme, at Monmouth University (MU). It covers the process, methods and tools specific to this area, together with the corresponding software quality issues. The need to produce software engineers with strong teamwork and…
15 CFR Supplement No. 6 to Part 742 - Guidelines for Submitting Review Requests for Encryption Items
Code of Federal Regulations, 2010 CFR
2010-01-01
... brochures or other documentation or specifications related to the technology, commodity or software... commodity or software, provide the following information: (1) Description of all the symmetric and... is provided by third-party hardware or software encryption components (if any). Identify the...
Models and Frameworks: A Synergistic Association for Developing Component-Based Applications
Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara
2014-01-01
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858
Web accessibility and open source software.
Obrenović, Zeljko
2009-07-01
A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration are complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling syntactic and semantic interoperability between Web extension mechanisms and the variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve the accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse project called Accessibility Tools Framework (ACTF), the aim of which is the development of an extensible infrastructure upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.
The Software Management Environment (SME)
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Decker, William; Buell, John
1988-01-01
The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.
Airland Battlefield Environment (ALBE) Tactical Decision Aid (TDA) Demonstration Program,
1987-11-12
Management System (DBMS) software, GKS graphics libraries, and user interface software. These components of the ATB system software architecture will be... knowledge base and augment the decision-making process by providing information useful in the formulation and execution of battlefield strategies... Topographic Laboratories as an Engineer. Ms. Capps is managing the software development of the AirLand Battlefield Environment (ALBE) geographic
A Review of Feature Extraction Software for Microarray Gene Expression Data
Tan, Ching Siang; Ting, Wai Soon; Mohamad, Mohd Saberi; Chan, Weng Howe; Deris, Safaai; Ali Shah, Zuraini
2014-01-01
When gene expression data are too large to be processed, they are transformed into a reduced representation set of genes. Transforming large-scale gene expression data into a set of genes is called feature extraction. If the genes extracted are carefully chosen, this gene set can extract the relevant information from the large-scale gene expression data, allowing further analysis by using this reduced representation instead of the full size data. In this paper, we review numerous software applications that can be used for feature extraction. The software reviewed is mainly for Principal Component Analysis (PCA), Independent Component Analysis (ICA), Partial Least Squares (PLS), and Local Linear Embedding (LLE). A summary and sources of the software are provided in the last section for each feature extraction method. PMID:25250315
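Of the four methods reviewed, PCA is the easiest to demonstrate. The sketch below uses scikit-learn (our tool choice, not one mandated by the review) on a random stand-in expression matrix.

```python
# A minimal sketch of feature extraction by PCA: reduce a samples-by-genes
# expression matrix to a handful of components for downstream analysis.
# The matrix here is random illustrative data, not a real data set.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 5000))   # 100 samples x 5000 genes (illustrative)

pca = PCA(n_components=10)
Z = pca.fit_transform(X)           # reduced representation: 100 x 10

print(Z.shape, pca.explained_variance_ratio_.sum())
```

Downstream analyses then operate on the 10-column reduced matrix Z instead of the full 5000-gene data, exactly the substitution the review describes.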
Simulation Control Graphical User Interface Logging Report
NASA Technical Reports Server (NTRS)
Hewling, Karl B., Jr.
2012-01-01
One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. The script ensures that a software component will not spin up until all the appropriate dependencies have been configured properly. I was also able to assist hardware modelers in verifying the configuration of models after they were upgraded to a new software version. I developed code that analyzes the MDL files to determine whether any errors were generated by the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.
Domain specific software architectures: Command and control
NASA Technical Reports Server (NTRS)
Braun, Christine; Hatch, William; Ruegsegger, Theodore; Balzer, Bob; Feather, Martin; Goldman, Neil; Wile, Dave
1992-01-01
GTE is the Command and Control contractor for the Domain Specific Software Architectures program. The objective of this program is to develop and demonstrate an architecture-driven, component-based capability for the automated generation of command and control (C2) applications. Such a capability will significantly reduce the cost of C2 applications development and will lead to improved system quality and reliability through the use of proven architectures and components. A major focus of GTE's approach is the automated generation of application components in particular subdomains. Our initial work in this area has concentrated in the message handling subdomain; we have defined and prototyped an approach that can automate one of the most software-intensive parts of C2 systems development. This paper provides an overview of the GTE team's DSSA approach and then presents our work on automated support for message processing.
Is Atherectomy the Best First-Line Therapy for Limb Salvage in Patients With Critical Limb Ischemia?
Loor, Gabriel; Skelly, Christopher L.; Wahlgren, Carl-Magnus; Bassiouny, Hisham S.; Piano, Giancarlo; Shaalan, Wael
2010-01-01
Objective To determine the efficacy of atherectomy for limb salvage compared with open bypass in patients with critical limb ischemia. Methods Ninety-nine consecutive bypass and atherectomy procedures performed for critical limb ischemia between January 2003 and October 2006 were reviewed. Results A total of 99 cases involving TASC C (n = 43, 44%) and D (n = 56, 56%) lesions were treated with surgical bypass in 59 patients and atherectomy in 33 patients. Bypass and atherectomy achieved similar 1-year primary patency (64% vs 63%; P = .2). However, the 1-year limb salvage rate was greater in the bypass group (87% vs 69%; P = .004). In the tissue loss subgroup, there was a greater limb salvage rate for bypass patients versus atherectomy (79% vs 60%; P = .04). Conclusions Patients with critical limb ischemia may do better with open bypass compared with atherectomy as first-line therapy for limb salvage. PMID:19640919
Percutaneous Femoropopliteal Recanalization Using a Completely Transpedal/Transtibial Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Timothy W.I., E-mail: timothy.clark@uphs.upenn.edu; Watts, Micah M.; Kwan, Tak W.
Purpose: To report preliminary experience with femoropopliteal revascularization using a completely transpedal/transtibial approach. Materials and Methods: Three patients with Rutherford 3–4 disease underwent revascularization of TASC C/D lesions using a pedal/tibial artery as the only site of arterial access. Results: One patient with a chronic superficial femoral artery occlusion had continuity achieved to the common femoral artery using a dedicated reentry device and stenting; in a second patient, an occluded popliteal artery stent was successfully revised with an endograft; and in a third patient with morbid obesity, a chronic SFA occlusion was successfully stented. All patients experienced complete resolution of presenting symptoms; no puncture site complications were seen. Conclusions: Use of a pedal/tibial approach as the sole site of arterial access may become an important access technique for femoropopliteal revascularization when patients have limited femoral access options.
Navy Technology Transfer Program FY 77 Summary Statistics.
1978-01-01
The Software Architecture of Global Climate Models
NASA Astrophysics Data System (ADS)
Alexander, K. A.; Easterbrook, S. M.
2011-12-01
It has become common to compare and contrast the output of multiple global climate models (GCMs), such as in the Climate Model Intercomparison Project Phase 5 (CMIP5). However, intercomparisons of the software architecture of GCMs are almost nonexistent. In this qualitative study of seven GCMs from Canada, the United States, and Europe, we attempt to fill this gap in research. We describe the various representations of the climate system as computer programs, and account for architectural differences between models. Most GCMs now practice component-based software engineering, where Earth system components (such as the atmosphere or land surface) are present as highly encapsulated sub-models. This architecture facilitates a mix-and-match approach to climate modelling that allows for convenient sharing of model components between institutions, but it also leads to difficulty when choosing where to draw the lines between systems that are not encapsulated in the real world, such as sea ice. We also examine different styles of couplers in GCMs, which manage interaction and data flow between components. Finally, we pay particular attention to the varying levels of complexity in GCMs, both between and within models. Many GCMs have some components that are significantly more complex than others, a phenomenon which can be explained by the respective institution's research goals as well as the origin of the model components. In conclusion, although some features of software architecture have been adopted by every GCM we examined, other features show a wide range of different design choices and strategies. These architectural differences may provide new insights into variability and spread between models.
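The coupler pattern described here can be sketched in a few lines: each encapsulated component advances one step and exchanges named fields only through the coupler. The components, fields, and values below are toy illustrations, not any particular GCM's interface.

```python
# A minimal sketch of the coupler pattern: components never call each
# other directly; all data flow goes through a shared field store that
# the coupler owns. Field names and values are illustrative only.
class Atmosphere:
    def step(self, fields):
        fields["surface_wind"] = 5.0          # produce forcing for other components

class Ocean:
    def step(self, fields):
        wind = fields.get("surface_wind", 0.0)
        fields["sst"] = 290.0 + 0.1 * wind    # consume forcing, produce SST

class Coupler:
    """Owns the shared field store and the order of component execution."""
    def __init__(self, components):
        self.components = components
        self.fields = {}

    def run(self, n_steps):
        for _ in range(n_steps):
            for component in self.components:
                component.step(self.fields)

Coupler([Atmosphere(), Ocean()]).run(10)
```

The mix-and-match sharing of sub-models between institutions that the study observes depends on exactly this kind of narrow, coupler-mediated interface.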
Aquarius' Object-Oriented, Plug and Play Component-Based Flight Software
NASA Technical Reports Server (NTRS)
Murray, Alexander; Shahabuddin, Mohammad
2013-01-01
The Aquarius mission involves a combined radiometer and radar instrument in low-Earth orbit, providing monthly global maps of Sea Surface Salinity. The mission has operated successfully in orbit since June 2011; the spacecraft bus was furnished by the Argentine space agency, Comision Nacional de Actividades Espaciales (CONAE). The instrument, built jointly by NASA's Caltech/JPL and Goddard Space Flight Center, has been successfully producing expectation-exceeding data since it was powered on in August 2011. In addition to the radiometer and scatterometer, the instrument contains a command & data-handling subsystem with a computer and flight software (FSW) that is responsible for managing the instrument, its operation, and its data. Aquarius' FSW is conceived and architected as a Component-based system, in which the running software consists of a set of Components, each playing a distinctive role in the subsystem, instantiated and connected together at runtime. Component architectures feature a well-defined set of interfaces between the Components, visible and analyzable at the architectural level (see [1]). As we will describe, this kind of architecture offers significant advantages over more traditional FSW architectures, which often feature a monolithic runtime structure. Component-based software is enabled by Object-Oriented (OO) techniques and languages, the use of which again is not typical in space mission FSW. We will argue in this paper that the use of OO design methods and tools (especially the Unified Modeling Language), as well as the judicious usage of C++, are very well suited to FSW applications, and we will present Aquarius FSW, describing our methods, processes, and design, as a successful case in point.
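The runtime-instantiated, runtime-connected Component idea can be sketched briefly. Aquarius FSW itself is C++ and its interfaces are not given in this abstract, so the Python below is only an illustration of the architectural style, with invented port and component names.

```python
# A minimal sketch of components wired together at runtime through
# named ports; the topology is declared in one place at startup, which
# is what makes it visible and analyzable at the architectural level.
class Component:
    def __init__(self, name):
        self.name = name
        self.ports = {}          # port name -> handler on the peer component

    def connect(self, port, handler):
        self.ports[port] = handler

    def send(self, port, message):
        self.ports[port](message)

telemetry = Component("telemetry")
radiometer = Component("radiometer")

# Runtime wiring: swap the handler and the radiometer never knows.
radiometer.connect("telemetry_out", lambda msg: print(f"[telemetry] {msg}"))
radiometer.send("telemetry_out", {"channel": "V", "counts": 1234})
```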
21 CFR 862.2570 - Instrumentation for clinical multiplex test systems.
Code of Federal Regulations, 2010 CFR
2010-04-01
... HUMAN SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical... hardware components, as well as raw data storage mechanisms, data acquisition software, and software to...
Test Driven Development of a Parameterized Ice Sheet Component
NASA Astrophysics Data System (ADS)
Clune, T.
2011-12-01
Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches, including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, its suitability to scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need to have simple, non-redundant closed-form expressions to compare against the results obtained from the implementation, as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to the far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
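The core TDD tactic the abstract mentions, comparing an implementation against a simple closed-form expression with a realistic error estimate, looks like this in miniature (the trapezoid integrator and tolerance are our own illustrative choices, not the ice sheet component's tests):

```python
# A minimal sketch of testing a numerical routine against a closed-form
# result with an explicit, theory-backed tolerance.
import math
import unittest

def trapezoid(f, a, b, n):
    """Composite trapezoid rule on [a, b] with n panels."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * total

class TestTrapezoid(unittest.TestCase):
    def test_against_closed_form(self):
        # The integral of sin on [0, pi] is exactly 2; the trapezoid rule's
        # error is O(h^2), so n=1000 panels comfortably beats 5 decimal places.
        self.assertAlmostEqual(trapezoid(math.sin, 0.0, math.pi, 1000), 2.0, places=5)

if __name__ == "__main__":
    unittest.main()
```

In TDD the test is written first from the known closed form and error bound; the implementation is then developed until the test passes, which is what makes the model innovator the natural author of such fixtures.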
Cognitive Foundry v. 3.0 (OSS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basilico, Justin; Dixon, Kevin; McClain, Jonathan
2009-11-18
The Cognitive Foundry is a unified collection of tools designed for research and applications that use cognitive modeling, machine learning, or pattern recognition. The software library contains design patterns, interface definitions, and default implementations of reusable software components and algorithms designed to support a wide variety of research and development needs. The library contains three main software packages: the Common package that contains basic utilities and linear algebraic methods, the Cognitive Framework package that contains tools to assist in implementing and analyzing theories of cognition, and the Machine Learning package that provides general algorithms and methods for populating Cognitive Framework components from domain-relevant data.
Use of software engineering techniques in the design of the ALEPH data acquisition system
NASA Astrophysics Data System (ADS)
Charity, T.; McClatchey, R.; Harvey, J.
1987-08-01
The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management, and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design, leading to enhanced functionality and reliability. Improved documentation and communication ensure continuity over the system life-cycle and simplify project management.
Agent-based models of cellular systems.
Cannata, Nicola; Corradini, Flavio; Merelli, Emanuela; Tesei, Luca
2013-01-01
Software agents are particularly suitable for engineering models and simulations of cellular systems. In a very natural and intuitive manner, individual software components are therein delegated to reproduce "in silico" the behavior of individual components of living systems at a given level of resolution. Individuals' actions and interactions among individuals allow complex collective behavior to emerge. In this chapter we first introduce the reader to software agents and multi-agent systems, reviewing the evolution of agent-based modeling of biomolecular systems in the last decade. We then describe the main tools, platforms, and methodologies available for programming societies of agents, possibly profiting from toolkits that do not require advanced programming skills.
Mission Management Computer Software for RLV-TD
NASA Astrophysics Data System (ADS)
Manju, C. R.; Joy, Josna Susan; Vidya, L.; Sheenarani, I.; Sruthy, C. N.; Viswanathan, P. C.; Dinesh, Sudin; Jayalekshmy, L.; Karuturi, Kesavabrahmaji; Sheema, E.; Syamala, S.; Unnikrishnan, S. Manju; Ali, S. Akbar; Paramasivam, R.; Sheela, D. S.; Shukkoor, A. Abdul; Lalithambika, V. R.; Mookiah, T.
2017-12-01
The Mission Management Computer (MMC) software is responsible for the autonomous navigation, sequencing, guidance and control of the Re-usable Launch Vehicle (RLV) through lift-off, ascent, coasting, re-entry, controlled descent and splashdown. A hard real-time system has been designed to handle the mission requirements in an integrated manner and to meet the stringent timing constraints. Redundancy management and fault-tolerance techniques are also built into the system, in order to achieve a successful mission even in the presence of component failures. This paper describes the functions and features of the components of the MMC software, which accomplished the successful RLV-Technology Demonstrator mission.
Development of RT-components for the M-3 Strawberry Harvesting Robot
NASA Astrophysics Data System (ADS)
Yamashita, Tomoki; Tanaka, Motomasa; Yamamoto, Satoshi; Hayashi, Shigehiko; Saito, Sadafumi; Sugano, Shigeki
We are developing a strawberry harvesting robot, the “M-3” prototype system, under the 4th urgent project of MAFF. In order to develop the control software of the M-3 robot more efficiently, we adopted the RT-middleware “OpenRTM-aist” software platform. In this system, we developed 9 kinds of RT-Components (RTCs): a robot task sequence player RTC, a proxy RTC for image processing software, a DC motor controller RTC, an arm kinematics RTC, and so on. In this paper, we discuss the advantages of the RT-middleware development approach and the problems end-users encounter when operating the RTC-configured robotic system.
Educational Software--New Guidelines for Development.
ERIC Educational Resources Information Center
Gold, Patricia Cohen
1984-01-01
Discusses standards developed by the Educational Computer Service of the National Education Association that incorporate technical, educational, and documentation components to guide authors in the development of quality educational software. (Author/MBR)
NASA Technical Reports Server (NTRS)
Liaw, Morris; Evesson, Donna
1988-01-01
The Software Engineering and Ada Database (SEAD) was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities which are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce duplication of effort while improving quality in the development of future software systems. SEAD data is organized into five major areas: information regarding education and training resources which are relevant to the life cycle of Ada-based software engineering projects such as those in the Space Station program; research publications relevant to NASA projects such as the Space Station Program and conferences relating to Ada technology; the latest progress reports on Ada projects completed or in progress both within NASA and throughout the free world; Ada compilers and other commercial products that support Ada software development; and reusable Ada components generated both within NASA and from elsewhere in the free world. This classified listing of reusable components shall include descriptions of tools, libraries, and other components of interest to NASA. Sources for the data include technical newsletters and periodicals, conference proceedings, the Ada Information Clearinghouse, product vendors, and project sponsors and contractors.
Software Health Management: A Short Review of Challenges and Existing Techniques
NASA Technical Reports Server (NTRS)
Pipatsrisawat, Knot; Darwiche, Adnan; Mengshoel, Ole J.; Schumann, Johann
2009-01-01
Modern spacecraft (as well as most other complex mechanisms like aircraft, automobiles, and chemical plants) rely more and more on software, to a point where software failures have caused severe accidents and loss of missions. Software failures during a manned mission can cause loss of life, so there are severe requirements to make the software as safe and reliable as possible. Typically, verification and validation (V&V) has the task of making sure that all software errors are found before the software is deployed and that it always conforms to the requirements. Experience, however, shows that this gold standard of error-free software cannot be reached in practice. Even if the software alone is free of glitches, its interoperation with the hardware (e.g., with sensors or actuators) can cause problems. Unexpected operational conditions or changes in the environment may ultimately cause a software system to fail. Is there a way to surmount this problem? In most modern aircraft and many automobiles, hardware such as central electrical, mechanical, and hydraulic components are monitored by IVHM (Integrated Vehicle Health Management) systems. These systems can recognize, isolate, and identify faults and failures, both those that already occurred as well as imminent ones. With the help of diagnostics and prognostics, appropriate mitigation strategies can be selected (replacement or repair, switch to redundant systems, etc.). In this short paper, we discuss some challenges and promising techniques for software health management (SWHM). In particular, we identify unique challenges for preventing software failure in systems which involve both software and hardware components. We then present our classifications of techniques related to SWHM. These classifications are performed based on dimensions of interest to both developers and users of the techniques, and hopefully provide a map for dealing with software faults and failures.
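The abstract above surveys SWHM broadly; as a toy illustration of the monitoring-and-mitigation idea (not any specific technique from the paper), a health monitor can check runtime outputs against expected bounds and select a mitigation. A minimal sketch, with invented thresholds and actions:

    # Illustrative runtime health monitor: flag out-of-bounds outputs and
    # pick a mitigation. All thresholds and action names are invented.
    def monitor(readings, lo=-10.0, hi=10.0):
        faults = [i for i, r in enumerate(readings) if not (lo <= r <= hi)]
        if not faults:
            return "nominal"
        # crude prognosis: many faults -> switch to a redundant component
        return "switch_redundant" if len(faults) > 2 else "isolate_and_retry"

    print(monitor([1.0, 2.5, -3.0]))          # nominal
    print(monitor([99.0, 2.5, 100.0, 55.0]))  # switch_redundant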
Why Free Software Matters for Literacy Educators.
ERIC Educational Resources Information Center
Brunelle, Michael D.; Bruce, Bertram C.
2002-01-01
Notes that understanding what "free software" means and its implications for access and use of new technologies is an important component of the new literacies. Concludes that if free speech and free press are essential to the development of a general literacy, then free software can promote the development of computer literacy. (SG)
Active Learning through Modeling: Introduction to Software Development in the Business Curriculum
ERIC Educational Resources Information Center
Roussev, Boris; Rousseva, Yvonna
2004-01-01
Modern software practices call for the active involvement of business people in the software process. Therefore, programming has become an indispensable part of the information systems component of the core curriculum at business schools. In this paper, we present a model-based approach to teaching introduction to programming to general business…
ERIC Educational Resources Information Center
Managan, William H.
1999-01-01
Describes a facilities-management software program that helps managers better document and understand maintenance backlogs, improvements, and future cyclic renewal needs. Major software components are examined including a software tool that filters, groups, and ranks projects to help determine funding requests. (GR)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-31
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-852] Certain Video Analytics Software... 337 of the Tariff Act of 1930, as amended, 19 U.S.C. 1337, on behalf of ObjectVideo, Inc. of Reston... sale within the United States after importation of certain video analytics software, systems...
Integrated System for Autonomous Science
NASA Technical Reports Server (NTRS)
Chien, Steve; Sherwood, Robert; Tran, Daniel; Cichy, Benjamin; Davies, Ashley; Castano, Rebecca; Rabideau, Gregg; Frye, Stuart; Trout, Bruce; Shulman, Seth;
2006-01-01
The New Millennium Program Space Technology 6 Project Autonomous Sciencecraft software implements an integrated system for autonomous planning and execution of scientific, engineering, and spacecraft-coordination actions. A prior version of this software was reported in "The TechSat 21 Autonomous Sciencecraft Experiment" (NPO-30784), NASA Tech Briefs, Vol. 28, No. 3 (March 2004), page 33. This software is now in continuous use aboard the Earth Orbiter 1 (EO-1) spacecraft mission and is being adapted for use in the Mars Odyssey and Mars Exploration Rovers missions. This software enables EO-1 to detect and respond to such events of scientific interest as volcanic activity, flooding, and freezing and thawing of water. It uses classification algorithms to analyze imagery onboard to detect changes, including events of scientific interest. Detection of such events triggers acquisition of follow-up imagery. The mission-planning component of the software develops a response plan that accounts for visibility of targets and operational constraints. The plan is then executed under control by a task-execution component of the software that is capable of responding to anomalies.
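As a rough sketch of the detect-and-respond loop described above — onboard change detection triggering a follow-up observation within a visibility window — consider the following fragment; all names, thresholds, and data are illustrative, not the flight software's:

    # Hedged sketch of the onboard detect/respond loop: classify an image
    # pair for change, then schedule follow-up imaging at the next
    # visibility window. Thresholds and data are invented.
    def detect_event(prev_pixels, new_pixels, threshold=0.2):
        changed = sum(abs(a - b) > 0.1 for a, b in zip(prev_pixels, new_pixels))
        return changed / len(new_pixels) > threshold

    def plan_followup(target_visible_at, now):
        # choose the earliest visibility window after 'now'
        return min(t for t in target_visible_at if t > now)

    if detect_event([0.1] * 8, [0.9] * 8):
        print("follow-up imaging at t =", plan_followup([3, 7, 12], now=5))  # 7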
Automating Risk Analysis of Software Design Models
Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
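A minimal sketch of the identification-tree idea — rules that match data-flow-diagram elements to candidate threats and mitigations — may clarify the approach; the rule set, element fields, and names below are invented for illustration and are not AutSEC's actual structures:

    # Hedged sketch of identification-tree-style threat matching over a
    # data flow diagram: each rule maps an element pattern to a candidate
    # threat and mitigation. Rules and field names are invented.
    RULES = [
        {"element": "data_flow", "crosses_trust_boundary": True,
         "threat": "tampering", "mitigation": "sign and verify messages"},
        {"element": "data_store", "holds_credentials": True,
         "threat": "information disclosure", "mitigation": "encrypt at rest"},
    ]

    def analyze(dfd_elements):
        findings = []
        for el in dfd_elements:
            for r in RULES:
                if el.get("element") == r["element"] and all(
                        el.get(k) == v for k, v in r.items()
                        if k not in ("element", "threat", "mitigation")):
                    findings.append((el["name"], r["threat"], r["mitigation"]))
        return findings

    print(analyze([{"name": "auth-db", "element": "data_store",
                    "holds_credentials": True}]))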
Space-Shuttle Emulator Software
NASA Technical Reports Server (NTRS)
Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram;
2007-01-01
A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.
Bánfai, Balázs; Porció, Roland; Kovács, Tibor
2014-01-01
SNOMED CT is a vital component in the future of semantic interoperability in healthcare, as it provides the meaning to EHRs via its semantically rich, controlled terminology. Communicating the concepts of this terminology to both humans and machines is crucial; therefore, formal guidelines for diagram and expression representations have been developed by the curators of SNOMED CT. This paper presents a novel, model-based approach to implementing these guidelines that allows simultaneous editing of a concept via both diagram and expression editors. The implemented extensible software component can be embedded in both desktop and web applications.
Design of a component-based integrated environmental modeling framework
Integrated environmental modeling (IEM) includes interdependent science-based components (e.g., models, databases, viewers, assessment protocols) that comprise an appropriate software modeling system. The science-based components are responsible for consuming and producing inform...
Analyzing and Predicting Effort Associated with Finding and Fixing Software Faults
NASA Technical Reports Server (NTRS)
Hamill, Maggie; Goseva-Popstojanova, Katerina
2016-01-01
Context: Software developers spend a significant amount of time fixing faults. However, not many papers have addressed the actual effort needed to fix software faults. Objective: The objective of this paper is twofold: (1) analysis of the effort needed to fix software faults and how it was affected by several factors and (2) prediction of the level of fix implementation effort based on the information provided in software change requests. Method: The work is based on data related to 1200 failures, extracted from the change tracking system of a large NASA mission. The analysis includes descriptive and inferential statistics. Predictions are made using three supervised machine learning algorithms and three sampling techniques aimed at addressing the imbalanced data problem. Results: Our results show that (1) 83% of the total fix implementation effort was associated with only 20% of failures. (2) Both safety critical failures and post-release failures required three times more effort to fix compared to non-critical and pre-release counterparts, respectively. (3) Failures with fixes spread across multiple components or across multiple types of software artifacts required more effort. The spread across artifacts was more costly than spread across components. (4) Surprisingly, some types of faults associated with later life-cycle activities did not require significant effort. (5) The level of fix implementation effort was predicted with 73% overall accuracy using the original, imbalanced data. Using oversampling techniques improved the overall accuracy up to 77%. More importantly, oversampling significantly improved the prediction of the high level effort, from 31% to around 85%. Conclusions: This paper shows the importance of tying software failures to changes made to fix all associated faults, in one or more software components and/or in one or more software artifacts, and the benefit of studying how the spread of faults and other factors affect the fix implementation effort.
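As an illustration of the prediction setup described above — classifying fix-effort level from change-request features on imbalanced data, with oversampling — here is a hedged sketch using scikit-learn and imbalanced-learn as stand-ins; the features and labels are synthetic, not the mission data:

    # Hedged sketch of effort-level prediction on imbalanced data, with
    # random oversampling. Libraries are stand-ins for the paper's three
    # algorithms and three sampling techniques; data is synthetic.
    import numpy as np
    from imblearn.over_sampling import RandomOverSampler
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    X = rng.random((300, 4))                    # e.g., spread, criticality, ...
    y = np.where(X[:, 0] > 0.9, "high", "low")  # rare "high"-effort class

    X_res, y_res = RandomOverSampler(random_state=0).fit_resample(X, y)
    clf = RandomForestClassifier(random_state=0).fit(X_res, y_res)
    print(clf.predict(X[:5]))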
Development of high performance scientific components for interoperability of computing packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulabani, Teena Pratap
2008-01-01
Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. A chemistry algorithm is hard to develop as well as being a time consuming process; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach the Tuning and Analysis Utility (TAU) has been coupled with NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
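The plug-and-play composition that CCA enables can be pictured with a provides/uses port pattern: one component provides an interface, another declares that it uses it, and the framework wires them together. A framework-free Python analogy follows (all names invented; this is not the CCA/Babel API):

    # Framework-free analogy of CCA-style provides/uses ports: a component
    # "provides" a port (an interface it implements) and "uses" ports that
    # the framework wires in. All names are illustrative only.
    class EnergyPort:                      # interface a chemistry code provides
        def total_energy(self, geometry): raise NotImplementedError

    class ToyChemistry(EnergyPort):        # stand-in for NWChem/GAMESS/MPQC
        def total_energy(self, geometry):
            return -1.0 * len(geometry)    # fake energy per atom

    class Optimizer:                       # component that *uses* an EnergyPort
        def __init__(self): self.energy = None
        def connect(self, port): self.energy = port   # framework wiring step
        def run(self, geometry): return self.energy.total_energy(geometry)

    framework_registry = {"EnergyPort": ToyChemistry()}
    opt = Optimizer()
    opt.connect(framework_registry["EnergyPort"])
    print(opt.run(["H", "H", "O"]))  # -3.0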
Zhao, Lei; Lim Choi Keung, Sarah N; Taweel, Adel; Tyler, Edward; Ogunsina, Ire; Rossiter, James; Delaney, Brendan C; Peterson, Kevin A; Hobbs, F D Richard; Arvanitis, Theodoros N
2012-01-01
Heterogeneous data models and coding schemes for electronic health records present challenges for automated search across distributed data sources. This paper describes a loosely coupled software framework based on the terminology controlled approach to enable the interoperation between the search interface and heterogeneous data sources. Software components interoperate via common terminology service and abstract criteria model so as to promote component reuse and incremental system evolution.
Microterminal/Microfiche System for Computer-Based Instruction: Hardware and Software Development.
1980-10-01
The system is composed of four major components and associated interfaces. The major components are (a) microterminal, (b) microfiche reader, (c) memory module, and (d)... Sensing of the position of the platen containing the microfiche allows frame locations to be verified by the microterminal software. Circuit descriptions and schematics for the circuitry used in the microfiche viewer and the adaptor module are given in Appendix C.
Hardisty, Frank; Robinson, Anthony C.
2010-01-01
In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
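The coordination strategy described above rests on discovering, by reflection, which components can respond to an event and subscribing them automatically. A minimal Python sketch of that idea, with an invented on_<event> naming convention rather than the GeoViz Toolkit's actual mechanism:

    # Hedged sketch of introspective observer coordination: use reflection
    # to find components exposing a matching listener method and subscribe
    # them automatically. The "on_<event>" convention is invented.
    class Map:
        def on_selection(self, ids): print("map highlights", ids)

    class Scatterplot:
        def on_selection(self, ids): print("plot highlights", ids)

    def wire(components, event="selection"):
        handler_name = f"on_{event}"
        # reflective discovery: any component with a callable on_<event>
        return [getattr(c, handler_name) for c in components
                if callable(getattr(c, handler_name, None))]

    listeners = wire([Map(), Scatterplot()])
    for notify in listeners:
        notify([3, 14, 15])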
NASA Technical Reports Server (NTRS)
Boulanger, Richard; Overland, David
2004-01-01
Technologies that facilitate the design and control of complex, hybrid, and resource-constrained systems are examined. This paper focuses on design methodologies and system architectures, not on specific control methods that may be applied to life support subsystems. Honeywell and Boeing have estimated that 60-80% of the effort in developing complex control systems is software development, and only 20-40% is control system development. It has also been shown that large software projects have failure rates of as high as 50-65%. Concepts discussed include the Unified Modeling Language (UML) and design patterns with the goal of creating a self-improving, self-documenting system design process. Successful architectures for control must not only facilitate hardware to software integration, but must also reconcile continuously changing software with much less frequently changing hardware. These architectures rely on software modules or components to facilitate change. Architecting such systems for change leverages the interfaces between these modules or components.
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.
1992-01-01
Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e. dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) to measure the software system to be considered; and (2) to build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at the NASA/GSFC into two fault density classes: low and high. Also we evaluate the accuracy of the model and the insights it provides into the software process.
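As a stand-in for the classification step described above — predicting low/high fault density from component metrics — the following sketch uses logistic regression; the paper's multivariate stochastic models differ, and the data here is synthetic:

    # Hedged stand-in for low/high fault-density classification from
    # simple component metrics. Logistic regression substitutes for the
    # paper's multivariate stochastic models; data is synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    X = rng.random((200, 3))                           # e.g., size, complexity, churn
    y = (X[:, 1] + 0.3 * X[:, 2] > 0.8).astype(int)    # 1 = high fault density

    model = LogisticRegression().fit(X, y)
    print(model.predict([[0.2, 0.9, 0.9], [0.5, 0.1, 0.1]]))  # likely [1 0]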
NASA Tech Briefs, July 1997. Volume 21, No. 7
NASA Technical Reports Server (NTRS)
1997-01-01
Topics: Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Life Sciences.
Ward, Robert; Dunn, Joie; Clavijo, Leonardo; Shavelle, David; Rowe, Vincent; Woo, Karen
2017-01-01
Background Patients presenting to a public hospital with critical limb ischemia (CLI) typically have advanced disease with significant comorbidities. The purpose of this study was to assess the influence of revascularization on 1-year amputation rate of CLI patients presenting to Los Angeles County USC Medical Center, classified according to the Society for Vascular Surgery Wound, Ischemia and foot Infection (WIfI). Methods A retrospective review of patients who presented to a public hospital with CLI from February 2010 to July 2014 was performed. Patients were classified according to the WIfI system. Only patients with complete data who survived at least 12 months after presentation were included. Results Ninety-three patients with 98 affected limbs were included. The mean age was 62.8 years. Eighty-two patients (84%) had hypertension and 71 (72%) had diabetes. Fifty (57.5%) limbs had Trans-Atlantic Inter-Society Consensus (TASC) C or D femoral–popliteal lesions and 82 (98%) had significant infrapopliteal disease. The majority had moderate or high WIfI amputation and revascularization scores. Eighty-four (86%) limbs underwent open, endovascular, or hybrid revascularization. Overall, one year major amputation (OYMA) rate was 26.5%. In limbs with high WIfI amputation score, the OYMA was 34.5%: 21.4% in those who were revascularized and 57% in those who were not. On univariable analysis, factors associated with increased risk of OYMA were nonrevascularization (P = 0.005), hyperlipidemia (P = 0.06), hemodialysis (P = 0.005), gangrene (P = 0.02), ulcer classification (P = 0.05), WIfI amputation score (P = 0.026), and WIfI wound grade (P = 0.04). On multivariable analysis, increasing WIfI amputation score (odds ratio [OR] 1.84, 95% confidence interval [CI] 1.0–3.39) was associated with increased risk of OYMA while revascularization (OR 0.24, 95% CI 0.07–0.80) was associated with decreased risk of OYMA. Conclusions The OYMA rates in this population were consistent with those predicted by the WIfI classification system. In this population, revascularization significantly reduced the risk of amputation. Comorbidities including diabetes mellitus and TASC classification did not moderate the association of WIfI amputation score with risk of 1-year major amputation. PMID:27546850
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman, Carol S.; Benzinger, Leonora; Beshers, George; Hammerslag, David; Kimball, John; Kirslis, Peter A.; Render, Hal; Richards, Paul; Terwilliger, Robert
1985-01-01
The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. The SAGA system consists of a small number of software components that are adapted by the meta-tools into specific tools for use in the software development application. The modules are designed so that the meta-tools can construct an environment which is both integrated and flexible. The SAGA project is documented in several papers which are presented.
MFV-class: a multi-faceted visualization tool of object classes.
Zhang, Zhi-meng; Pan, Yun-he; Zhuang, Yue-ting
2004-11-01
Classes are key software components in an object-oriented software system. In many industrial OO software systems, there are classes that have a complicated structure and relationships. So in the processes of software maintenance, testing, reengineering, reuse, and restructuring, it is a challenge for software engineers to understand these classes thoroughly. This paper proposes a class comprehension model based on constructivist learning theory, and implements a software visualization tool (MFV-Class) to help in the comprehension of a class. The tool provides multiple views of a class to uncover the manifold facets of its contents. It enables visualizing three object-oriented metrics of classes to help users focus on the understanding process. A case study was conducted to evaluate our approach and the toolkit.
C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component
NASA Astrophysics Data System (ADS)
Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.
2018-06-01
The failure of tractor components and their replacement has now become very common in India because of re-cycling, re-sale, and duplication. To overcome the problem of failure, we propose a design methodology for topological change, co-simulated with software tools. In the proposed design methodology, the designer checks Paxial, Pcr, Pfailure, and τ by hand calculations, from which refined topological changes of the R.S.Arm are derived. We explain several techniques employed in the component for the reduction and removal of rib material to change the center of gravity and centroid point, using SystemC for mixed-level simulation and faster topological changes. The design process in SystemC can be compiled and executed with the software TURBO C7. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a slot of 120 × 4.75 × 32.5 mm at the center, showed greater effectiveness than the original component.
Code of Federal Regulations, 2013 CFR
2013-04-01
... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...
Code of Federal Regulations, 2014 CFR
2014-04-01
... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-18
... Thereof and Associated Software AGENCY: U.S. International Trade Commission. ACTION: Notice, Institution... thereof and associated software, by reason of infringement of U.S. Patent No. 8,028,323 (``the `323 patent... mobile phones, components thereof and associated software by reason of infringement of one or more of...
Applications of Logic Coverage Criteria and Logic Mutation to Software Testing
ERIC Educational Resources Information Center
Kaminski, Garrett K.
2011-01-01
Logic is an important component of software. Thus, software logic testing has enjoyed significant research over a period of decades, with renewed interest in the last several years. One approach to detecting logic faults is to create and execute tests that satisfy logic coverage criteria. Another approach to detecting faults is to perform mutation…
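The two approaches the abstract names can be made concrete with a small example: a logic mutant changes one operator in a predicate, and a test input that yields different outcomes on the original and the mutant "kills" it. A hedged sketch with an invented predicate:

    # Illustrative logic-mutation example: mutate one logical operator and
    # show a test input that kills the mutant (passes on the original,
    # fails on the mutant). The predicate is invented.
    def original(a, b, c):
        return (a or b) and c

    def mutant(a, b, c):          # 'and' mutated to 'or'
        return (a or b) or c

    # (a, b, c) = (True, False, False) distinguishes them:
    assert original(True, False, False) is False
    assert mutant(True, False, False) is True   # mutant killed by this input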
Developing the E-Scape Software System
ERIC Educational Resources Information Center
Derrick, Karim
2012-01-01
Most innovations have contextual pre-cursors that prompt new ways of thinking and in their turn help to give form to the new reality. This was the case with the e-scape software development process. The origins of the system existed in software components and ideas that we had developed through previous projects, but the ultimate direction we took…
SOA Governance: A Critical SOA Success Factor
2010-04-01
Building blocks of a SOA, from a software perspective: Service – a software-implemented capability that is well-defined, self-contained, and does not depend on the context or state of other services. Service Consumer – a service, application, or other software component that requires a specific service; it is located through a registry and initiates the service.
Judicious use of custom development in an open source component architecture
NASA Astrophysics Data System (ADS)
Bristol, S.; Latysh, N.; Long, D.; Tekell, S.; Allen, J.
2014-12-01
Modern software engineering is not as much programming from scratch as innovative assembly of existing components. Seamlessly integrating disparate components into scalable, performant architecture requires sound engineering craftsmanship and can often result in increased cost efficiency and accelerated capabilities if software teams focus their creativity on the edges of the problem space. ScienceBase is part of the U.S. Geological Survey scientific cyberinfrastructure, providing data and information management, distribution services, and analysis capabilities in a way that strives to follow this pattern. ScienceBase leverages open source NoSQL and relational databases, search indexing technology, spatial service engines, numerous libraries, and one proprietary but necessary software component in its architecture. The primary engineering focus is cohesive component interaction, including construction of a seamless Application Programming Interface (API) across all elements. The API allows researchers and software developers alike to leverage the infrastructure in unique, creative ways. Scaling the ScienceBase architecture and core API with increasing data volume (more databases) and complexity (integrated science problems) is a primary challenge addressed by judicious use of custom development in the component architecture. Other data management and informatics activities in the earth sciences have independently resolved to a similar design of reusing and building upon established technology and are working through similar issues for managing and developing information (e.g., U.S. Geoscience Information Network; NASA's Earth Observing System Clearing House; GSToRE at the University of New Mexico). Recent discussions facilitated through the Earth Science Information Partners are exploring potential avenues to exploit the implicit relationships between similar projects for explicit gains in our ability to more rapidly advance global scientific cyberinfrastructure.
Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment
Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon
2013-01-01
Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved. PMID:24078906
High Resolution X-Ray Micro-CT of Ultra-Thin Wall Space Components
NASA Technical Reports Server (NTRS)
Roth, Don J.; Rauser, R. W.; Bowman, Randy R.; Bonacuse, Peter; Martin, Richard E.; Locci, I. E.; Kelley, M.
2012-01-01
A high resolution micro-CT system has been assembled and is being used to provide optimal characterization for ultra-thin wall space components. The Glenn Research Center NDE Sciences Team, using this CT system, has assumed the role of inspection vendor for the Advanced Stirling Convertor (ASC) project at NASA. This article will discuss many aspects of the development of the CT scanning for this type of component, including CT system overview; inspection requirements; process development, software utilized and developed to visualize, process, and analyze results; calibration sample development; results on actual samples; correlation with optical/SEM characterization; CT modeling; and development of automatic flaw recognition software. Keywords: Nondestructive Evaluation, NDE, Computed Tomography, Imaging, X-ray, Metallic Components, Thin Wall Inspection
Rezaei-Hachesu, Peyman; Pesianian, Esmaeil; Mohammadian, Mohsen
2016-02-01
A radiology information system (RIS) must be well designed in order to reduce workload and improve the quality of services. Heuristic evaluation is one of the methods that uncovers usability problems with the least time, cost, and resources. The aim of the present study is to evaluate the usability of RISs in hospitals. This is a cross-sectional descriptive study (2015) that uses the heuristic evaluation method to evaluate the usability of the RIS used in 3 hospitals of Tabriz city. The data were collected using a standard checklist based on the 13 principles of Nielsen's heuristic evaluation method. Usability of the RISs was assessed based on the number of components observed from Nielsen's principles, and usability problems were assessed based on the number of non-observed components as well as non-existent or unrecognizable components. By evaluating the RISs in each of hospitals 1, 2, and 3, the total numbers of observed components were obtained as 173, 202, and 196, respectively. It was concluded that the usability of the RISs in the studied population, on average and with 190 of the 291 components related to the 13 Nielsen principles observed, is 65.41%. Furthermore, usability problems were obtained as 26.35%. The established and visible nature of some components, such as application response time, visual feedback, colors, and the appearance, design, and arrangement of software objects, draws more attention to these components as principal components in UI software design. Also, incorrect analysis before system design leads to a lack of attention to secondary needs such as software Help and security issues.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les
1991-01-01
The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.
Probabilistic, Decision-theoretic Disease Surveillance and Control
Wagner, Michael; Tsui, Fuchiang; Cooper, Gregory; Espino, Jeremy U.; Harkema, Hendrik; Levander, John; Villamarin, Ricardo; Voorhees, Ronald; Millett, Nicholas; Keane, Christopher; Dey, Anind; Razdan, Manik; Hu, Yang; Tsai, Ming; Brown, Shawn; Lee, Bruce Y.; Gallagher, Anthony; Potter, Margaret
2011-01-01
The Pittsburgh Center of Excellence in Public Health Informatics has developed a probabilistic, decision-theoretic system for disease surveillance and control for use in Allegheny County, PA and later in Tarrant County, TX. This paper describes the software components of the system and its knowledge bases. The paper uses influenza surveillance to illustrate how the software components transform data collected by the healthcare system into population level analyses and decision analyses of potential outbreak-control measures. PMID:23569617
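The decision-analytic step described above can be illustrated by choosing the control measure with the highest expected utility given the posterior probability of an outbreak. A minimal sketch; the probabilities and utilities below are invented, not the system's knowledge bases:

    # Hedged sketch of decision-theoretic control-measure selection: pick
    # the action maximizing expected utility under P(outbreak | data).
    def expected_utility(p_outbreak, utilities):
        u_out, u_no = utilities   # (utility if outbreak, utility if not)
        return p_outbreak * u_out + (1 - p_outbreak) * u_no

    actions = {
        "do_nothing":     (-100.0,   0.0),
        "close_schools":  ( -20.0, -15.0),
        "mass_vaccinate": ( -10.0, -30.0),
    }
    p = 0.3  # posterior outbreak probability from the surveillance models
    best = max(actions, key=lambda a: expected_utility(p, actions[a]))
    print(best)  # close_schools (EU -16.5, vs -30.0 and -24.0)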
The Trial Software version for DEMETER power spectrum files visualization and mapping
NASA Astrophysics Data System (ADS)
Lozbin, Anatoliy; Inchin, Alexander; Shpadi, Maxim
2010-05-01
In the frame of the creation of Kazakhstan's Scientific Space System for earthquake-precursor research, the hardware and software of the DEMETER satellite were investigated. The data processing software of DEMETER is based on the SWAN package under the IDL Virtual Machine and provides many features, but it lacks an important tool for spectrogram analysis: space-time visualization of power spectrum files from the electromagnetic devices ICE and IMSC. To eliminate this problem we have developed the software offered here. DeSS (DEMETER Spectrogram Software) is software for visualization, analysis, and mapping of power spectrum data from the electromagnetic devices ICE and IMSC. The software's primary goal is to give the researcher a friendly tool for the analysis of electromagnetic data from the DEMETER satellite for earthquake precursors and other ionospheric events. The input data for the DeSS software are power spectrum files: power spectrum of 1 component of the electric field in the VLF range (APID 1132); power spectrum of 1 component of the electric field in the HF range (APID 1134); power spectrum of 1 component of the magnetic field in the VLF range (APID 1137). The main features and operations of the software include: various time and frequency filtration; visualization of the time dependence of signal intensity at a fixed frequency; spectral density visualization for a fixed frequency range; spectrogram auto-sizing and smoothing; the information at each point of the spectrogram (time, frequency, and intensity); the spectrum information in a separate window consisting of 4 blocks; and data mapping with a 6-range scale. On the map we can browse the following information: the satellite orbit; the conjugate point at the satellite altitude; the north conjugate point at an altitude of 110 km; and the south conjugate point at an altitude of 110 km. This is only a trial software version to help researchers, and we are always ready to collaborate with scientists on software improvement. References: 1. D. Lagoutte, J.Y. Brochot, D. de Carvalho, L. Madrias and M. Parrot. DEMETER Microsatellite. Scientific Mission Center. Data product description. DMT-SP-9-CM-6054-LPC. 2. D. Lagoutte, J.Y. Brochot, P. Latremoliere. SWAN - Software for Waveform Analysis. LPCE/NI/003.E - Part 1 (User's guide), Part 2 (Analysis tools), Part 3 (User's project interface).
Atmospheric Science Data Center
2013-04-01
... free of charge from JPL, upon completion of a license agreement. hdfscan software consists of two components - a core hdf file ... at the Jet Propulsion Laboratory. To obtain the license agreement, go to the MISR Science Software web page, read the introductory ...
Enhanced CARES Software Enables Improved Ceramic Life Prediction
NASA Technical Reports Server (NTRS)
Janosik, Lesley A.
1997-01-01
The NASA Lewis Research Center has developed award-winning software that enables American industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, graphite) structures in a wide variety of 21st century applications. The CARES (Ceramics Analysis and Reliability Evaluation of Structures) series of software is successfully used by numerous engineers in industrial, academic, and government organizations as an essential element of the structural design and material selection processes. The latest version of this software, CARES/Life, provides a general-purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. CARES/Life was recently enhanced by adding new modules designed to improve functionality and user-friendliness. In addition, a beta version of the newly-developed CARES/Creep program (for determining the creep life of monolithic ceramic components) has just been released to selected organizations.
Improving a data-acquisition software system with abstract data type components
NASA Technical Reports Server (NTRS)
Howard, S. D.
1990-01-01
Abstract data types and object-oriented design are active research areas in computer science and software engineering. Much of the interest is aimed at new software development. Abstract data type packages developed for a discontinued software project were used to improve a real-time data-acquisition system under maintenance. The result saved effort and contributed to a significant improvement in the performance, maintainability, and reliability of the Goldstone Solar System Radar Data Acquisition System.
A Framework for Software Reuse in Safety-Critical System of Systems
2008-03-01
Pressman, on the other hand, defines a software component as a unit of composition with contractually specified and explicit context... (R.S. Pressman, Software Engineering: A Practitioner's Approach, Sixth Edition, New York, NY: McGraw-Hill, 2005).
An Architecture, System Engineering, and Acquisition Approach for Space System Software Resiliency
NASA Astrophysics Data System (ADS)
Phillips, Dewanne Marie
Software intensive space systems can harbor defects and vulnerabilities that may enable external adversaries or malicious insiders to disrupt or disable system functions, risking mission compromise or loss. Mitigating this risk demands a sustained focus on the security and resiliency of the system architecture including software, hardware, and other components. Robust software engineering practices contribute to the foundation of a resilient system so that the system "can take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". Software resiliency must be a priority and addressed early in life cycle development to contribute to a secure and dependable space system. Those who develop, implement, and operate software intensive space systems must determine the factors and systems engineering practices to address when investing in software resiliency. This dissertation offers methodical approaches for improving space system resiliency through software architecture design, system engineering, and increased software security, thereby reducing the risk of latent software defects and vulnerabilities. By providing greater attention to the early life cycle phases of development, we can alter the engineering process to help detect, eliminate, and avoid vulnerabilities before space systems are delivered. To achieve this objective, this dissertation identifies knowledge, techniques, and tools that engineers and managers can utilize to help them recognize how vulnerabilities are produced and discovered so that they can learn to circumvent them in future efforts. We conducted a systematic review of existing architectural practices, standards, security and coding practices, various threats, defects, and vulnerabilities that impact space systems from hundreds of relevant publications and interviews of subject matter experts. We expanded on the system-level body of knowledge for resiliency and identified a new software architecture framework and acquisition methodology to improve the resiliency of space systems from a software perspective, with an emphasis on the early phases of the systems engineering life cycle. This methodology involves seven steps: 1) Define technical resiliency requirements, 1a) Identify standards/policy for software resiliency, 2) Develop a request for proposal (RFP)/statement of work (SOW) for resilient space systems software, 3) Define software resiliency goals for space systems, 4) Establish software resiliency quality attributes, 5) Perform architectural tradeoffs and identify risks, 6) Conduct architecture assessments as part of the procurement process, and 7) Ascertain space system software architecture resiliency metrics. Data illustrate that software vulnerabilities can lead to opportunities for malicious cyber activities, which could degrade the space mission capability for the user community. Reducing the number of vulnerabilities by improving architecture and software system engineering practices can contribute to making space systems more resilient. Since cyber-attacks are enabled by shortfalls in software, robust software engineering practices and an architectural design are foundational to resiliency, which is a quality that allows the system to "take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time".
To achieve software resiliency for space systems, acquirers and suppliers must identify relevant factors and systems engineering practices to apply across the lifecycle, in software requirements analysis, architecture development, design, implementation, verification and validation, and maintenance phases.
O'Connor, B P
2000-08-01
Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
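The paper's programs target SPSS and SAS; as a language-neutral illustration, Horn's parallel analysis can be sketched in a few lines of Python/NumPy (iteration count and percentile are conventional choices, not necessarily the paper's):

    # Minimal sketch of Horn's parallel analysis: retain the leading
    # components whose observed correlation-matrix eigenvalues exceed the
    # chosen percentile of eigenvalues from random data of the same shape.
    import numpy as np

    def parallel_analysis(data, n_iter=1000, percentile=95, seed=0):
        rng = np.random.default_rng(seed)
        n, p = data.shape
        obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
        rand = np.empty((n_iter, p))
        for i in range(n_iter):
            x = rng.standard_normal((n, p))
            rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(x, rowvar=False)))[::-1]
        thresh = np.percentile(rand, percentile, axis=0)
        k = 0
        while k < p and obs[k] > thresh[k]:
            k += 1
        return k

    rng = np.random.default_rng(42)
    demo = rng.standard_normal((200, 6))
    demo[:, 1] += demo[:, 0]            # induce one correlated pair
    print(parallel_analysis(demo))      # typically 1 retained component

The MAP test follows a similar recipe: average the squared partial correlations after successively partialling out components, and keep the number of components that minimizes that average.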
Selecting reusable components using algebraic specifications
NASA Technical Reports Server (NTRS)
Eichmann, David A.
1992-01-01
A significant hurdle confronts the software reuser attempting to select candidate components from a software repository - discriminating between those components without resorting to inspection of the implementation(s). We outline a mixed classification/axiomatic approach to this problem based upon our lattice-based faceted classification technique and Guttag and Horning's algebraic specification techniques. This approach selects candidates by natural language-derived classification, by their interfaces, using signatures, and by their behavior, using axioms. We briefly outline our problem domain and related work. Lattice-based faceted classifications are described; the reader is referred to surveys of the extensive literature for algebraic specification techniques. Behavioral support for reuse queries is presented, followed by the conclusions.
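The behavioral half of the approach can be pictured with a classic Guttag-and-Horning-style example: a stack signature plus axioms relating its operations. The toy sketch below checks the axioms on sample values (a sketch, not a proof):

    # Toy algebraic specification of a stack: a signature (operation
    # types) plus axioms. Checking axioms on samples only illustrates the
    # idea; real selection would match signatures and reason over axioms.
    def new(): return ()
    def push(s, x): return s + (x,)
    def pop(s): return s[:-1]          # pop : Stack -> Stack (partial)
    def top(s): return s[-1]           # top : Stack -> Elem   (partial)
    def is_empty(s): return s == ()

    # Axioms:  pop(push(s, x)) = s      top(push(s, x)) = x
    #          is_empty(new()) = true   is_empty(push(s, x)) = false
    for s in [(), (1,), (1, 2)]:
        assert pop(push(s, 9)) == s and top(push(s, 9)) == 9
    assert is_empty(new()) and not is_empty(push(new(), 1))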
The integration of the risk management process with the lifecycle of medical device software.
Pecoraro, F; Luzi, D
2014-01-01
The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations that specifically address software as an important component of MD, require complex procedures to make software compliant with safety requirements, introducing thereby new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. Under this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software have been used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process within the software development lifecycle has been proposed too. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer already at the initial stages of the software design, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition of software being an important component of MDs as stated in regulations and standards. This implies the performance of highly iterative processes that have to integrate the risk management in the framework of software development. It also makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.
Hufnagel, S; Harbison, K; Silva, J; Mettala, E
1994-01-01
This paper describes a new method for the evolutionary determination of user requirements and system specifications called the scenario-based engineering process (SEP). Health care professional workstations are critical components of large scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for the management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal specifications of components' behavior. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' views of the system. Our goal is a threefold benefit: (i) scenario view abstractions provide consistent interdisciplinary communication; (ii) hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long term evolution; and (iii) integration of SEP and health care DSSAs into computer-aided software engineering (CASE) environments. These environments should support rapid construction and certification of individualized systems from reuse libraries.
Process Management inside ATLAS DAQ
NASA Astrophysics Data System (ADS)
Alexandrov, I.; Amorim, A.; Badescu, E.; Burckhart-Chromek, D.; Caprini, M.; Dobson, M.; Duval, P. Y.; Hart, R.; Jones, R.; Kazarov, A.; Kolos, S.; Kotov, V.; Liko, D.; Lucio, L.; Mapelli, L.; Mineev, M.; Moneta, L.; Nassiakou, M.; Pedro, L.; Ribeiro, A.; Roumiantsev, V.; Ryabov, Y.; Schweiger, D.; Soloviev, I.; Wolters, H.
2002-10-01
The Process Management component of the online software of the future ATLAS experiment data acquisition system is presented. The purpose of the Process Manager is to perform basic job control of the software components of the data acquisition system. It is capable of starting, stopping and monitoring the status of those components on the data acquisition processors independent of the underlying operating system. Its architecture is designed on the basis of a server client model using CORBA based communication. The server part relies on C++ software agent objects acting as an interface between the local operating system and client applications. Some of the major design challenges of the software agents were to achieve the maximum degree of autonomy possible, to create processes aware of dynamic conditions in their environment and with the ability to determine corresponding actions. Issues such as the performance of the agents in terms of time needed for process creation and destruction, the scalability of the system taking into consideration the final ATLAS configuration and minimizing the use of hardware resources were also of critical importance. Besides the details given on the architecture and the implementation, we also present scalability and performance tests results of the Process Manager system.
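The agent's basic job-control contract — start, stop, and query the status of a component on its host — can be sketched as follows; the interface is invented for illustration (the ATLAS implementation uses CORBA-based C++ software agents):

    # Hedged sketch of a process-manager agent: basic job control for
    # components on one host. Interface names are invented.
    import subprocess

    class ProcessAgent:
        def __init__(self): self.jobs = {}
        def start(self, name, argv):
            self.jobs[name] = subprocess.Popen(argv)
        def status(self, name):
            p = self.jobs.get(name)
            if p is None: return "unknown"
            return "running" if p.poll() is None else f"exited({p.returncode})"
        def stop(self, name):
            self.jobs[name].terminate()
            self.jobs[name].wait()

    agent = ProcessAgent()
    agent.start("monitor", ["python3", "-c", "print('component up')"])
    print(agent.status("monitor"))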
yourSky: Custom Sky-Image Mosaics via the Internet
NASA Technical Reports Server (NTRS)
Jacob, Joseph
2003-01-01
yourSky (http://yourSky.jpl.nasa.gov) is a computer program that supplies custom astronomical image mosaics of sky regions specified by requesters using client computers connected to the Internet. [yourSky is an upgraded version of the software reported in Software for Generating Mosaics of Astronomical Images (NPO-21121), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 16a.] A requester no longer has to engage in the tedious process of determining what subset of images is needed, nor even to know how the images are indexed in image archives. Instead, in response to a requester's specification of the size and location of the sky area (and optionally of the desired set and type of data, resolution, coordinate system, projection, and image format), yourSky automatically retrieves the component image data from archives totaling tens of terabytes stored on computer tape and disk drives at multiple sites and assembles the component images into a mosaic image by use of a high-performance parallel code. yourSky runs on the server computer where the mosaics are assembled. Because yourSky includes a Web-interface component, no special client software is needed: ordinary Web browser software is sufficient.
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of the analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process and may prevent later, much larger difficulties. The second section describes how software reliability estimation and prediction analysis provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design trade-offs among reliability, cost, performance, and schedule.
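The handbook's reliability tools are not reproduced here. As a hedged illustration of what reliability estimation involves, the sketch below fits the classic Goel-Okumoto growth model m(t) = a(1 - e^(-bt)) to made-up cumulative failure counts; the model choice and data are assumptions for illustration, not necessarily what the handbook uses.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical test data: cumulative failures observed by the end of week t.
t = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
failures = np.array([5.0, 9.0, 12.0, 14.0, 15.5, 16.5, 17.0, 17.3])

def goel_okumoto(t, a, b):
    # a: expected total number of faults; b: per-fault detection rate.
    return a * (1.0 - np.exp(-b * t))

(a, b), _ = curve_fit(goel_okumoto, t, failures, p0=(20.0, 0.3))

# For an NHPP model, P(no failure in (t1, t2)) = exp(-(m(t2) - m(t1))).
expected_next_week = goel_okumoto(t[-1] + 1, a, b) - goel_okumoto(t[-1], a, b)
print(f"estimated total faults: {a:.1f}")
print(f"P(failure-free next week) ~ {np.exp(-expected_next_week):.3f}")
```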
ERIC Educational Resources Information Center
Scott, Elsje; Zadirov, Alexander; Feinberg, Sean; Jayakody, Ruwanga
2004-01-01
Software testing is a crucial component in the development of good quality systems in industry. For this reason it was considered important to investigate the extent to which the Information Systems (IS) syllabus at the University of Cape Town (UCT) was aligned with accepted software testing practices in South Africa. For students to be effective…
User Documentation for Multiple Software Releases
NASA Technical Reports Server (NTRS)
Humphrey, R.
1982-01-01
In proposed solution to problems of frequent software releases and updates, documentation would be divided into smaller packages, each of which contains data relating to only one of several software components. Changes would not affect entire document. Concept would improve dissemination of information regarding changes and would improve quality of data supporting packages. Would help to ensure both timeliness and more thorough scrutiny of changes.
Integrated design optimization research and development in an industrial environment
NASA Astrophysics Data System (ADS)
Kumar, V.; German, Marjorie D.; Lee, S.-J.
1989-04-01
An overview is given of a design optimization project that has been in progress at the GE Research and Development Center for the past few years. The objective of this project is to develop a methodology and a software system for design automation and optimization of structural/mechanical components and systems. The effort focuses on research and development issues and also on optimization applications that can be related to real-life industrial design problems. The overall technical approach is based on integration of numerical optimization techniques, finite element methods, CAE and software engineering, and artificial intelligence/expert systems (AI/ES) concepts. The role of each of these engineering technologies in the development of a unified design methodology is illustrated. A software system, DESIGN-OPT, has been developed for both size and shape optimization of structural components subjected to static as well as dynamic loadings. By integrating this software with an automatic mesh generator, a geometric modeler, and an attribute specification computer code, a software module, SHAPE-OPT, has been developed for shape optimization. Details of these software packages, together with their applications to some two- and three-dimensional design problems, are described.
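DESIGN-OPT itself is not shown in this abstract. As a minimal sketch of the kind of size optimization it performs, the toy problem below couples a numerical optimizer to a closed-form cantilever-beam stress model; the objective, constraint, and numbers are textbook stand-ins chosen for illustration, not GE's formulation.

```python
from scipy.optimize import minimize

# Toy size optimization: choose width b and height h of a rectangular
# cantilever cross-section to minimize mass, subject to a bending-stress
# limit. The closed-form stress stands in for the FEA DESIGN-OPT couples to.
L, F, rho, sigma_allow = 1.0, 1000.0, 7850.0, 150e6  # m, N, kg/m^3, Pa

def mass(x):
    b, h = x
    return rho * L * b * h

def stress_margin(x):
    b, h = x
    sigma = 6.0 * F * L / (b * h**2)  # max bending stress at the root
    return sigma_allow - sigma        # must stay >= 0

res = minimize(mass, x0=[0.05, 0.05],
               constraints=[{"type": "ineq", "fun": stress_margin}],
               bounds=[(0.01, 0.2), (0.01, 0.2)])
b, h = res.x
print(f"b = {b*1000:.1f} mm, h = {h*1000:.1f} mm, mass = {mass(res.x):.2f} kg")
```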
Development of software for computing forming information using a component based approach
NASA Astrophysics Data System (ADS)
Ko, Kwang Hee; Park, Jiing Seo; Kim, Jung; Kim, Young Bum; Shin, Jong Gye
2009-12-01
In the shipbuilding industry, manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and accordingly productivity has increased drastically. Despite such improvement in manufacturing technology, however, development of an automatic system for fabricating curved hull plates remains at the beginning stage, since hardware and software for the automation of the curved hull fabrication process must be developed differently depending on the dimensions of the plates, the forming methods, and the manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a "plug-in" framework that can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for automatic fabrication of curved hull plates is proposed, consisting of four components and related software. In particular, the software module for computing forming information is developed using the ooCBD development methodology, which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.
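As a hedged sketch of the "plug-in" idea described above: the registry decorator and the forming-method classes below are invented for illustration; the paper's ooCBD components are not shown in the abstract.

```python
# Minimal plug-in registry: shipyard-specific forming modules register
# themselves under a name, and the framework looks them up at run time
# instead of hard-coding any one implementation.
REGISTRY = {}

def plugin(name):
    def register(cls):
        REGISTRY[name] = cls
        return cls
    return register

@plugin("line_heating")
class LineHeatingForming:
    def forming_info(self, plate):
        return f"heating lines for plate {plate}"

@plugin("cold_press")
class ColdPressForming:
    def forming_info(self, plate):
        return f"press strokes for plate {plate}"

def compute_forming_info(method, plate):
    # Look up whichever module the shipyard configured.
    return REGISTRY[method]().forming_info(plate)

print(compute_forming_info("line_heating", "P-102"))
```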
DOIDB: Reusing DataCite's search software as metadata portal for GFZ Data Services
NASA Astrophysics Data System (ADS)
Elger, K.; Ulbricht, D.; Bertelmann, R.
2016-12-01
GFZ Data Services is the central service point for the publication of research data at the Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences (GFZ). It provides data publishing services to scientists of GFZ, associated projects, and associated institutions. The publishing services aim to make research data and physical samples visible and citable by assigning persistent identifiers (DOI, IGSN) and by complementing existing IT infrastructure. To integrate several research domains, a modular software stack made of free software components has been created to manage data and metadata as well as register persistent identifiers [1]. The pivotal component for the registration of DOIs is the DOIDB. It has been derived from three software components provided by DataCite [2] that moderate the registration of DOIs and the deposition of metadata, allow the dissemination of metadata, and provide a user interface to navigate and discover datasets. The DOIDB acts as a proxy to the DataCite infrastructure and, in addition to the DataCite metadata schema, allows metadata to be deposited and disseminated following the ISO 19139 and NASA GCMD DIF schemas. The search component has been modified to meet the requirements of a geosciences metadata portal. In particular, it has been altered to make use of Apache Solr's capability to index and query spatial coordinates. Furthermore, the user interface has been adjusted to provide a first impression of the data by showing a map, summary information, and subjects. DOIDB and its components are available on GitHub [3]. We present a software solution for the registration of DOIs that integrates existing data systems, keeps track of registered DOIs, and provides a metadata portal to discover datasets [4].
[1] Ulbricht, D.; Elger, K.; Bertelmann, R.; Klump, J. panMetaDocs, eSciDoc, and DOIDB—An Infrastructure for the Curation and Publication of File-Based Datasets for GFZ Data Services. ISPRS Int. J. Geo-Inf. 2016, 5, 25. http://doi.org/10.3390/ijgi5030025
[2] https://github.com/datacite
[3] https://github.com/ulbricht/search/tree/doidb, https://github.com/ulbricht/mds/tree/doidb, https://github.com/ulbricht/oaip/tree/doidb
[4] http://doidb.wdc-terra.org
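As a hedged sketch of the Solr spatial querying the abstract mentions: the host, core name, and field names below are hypothetical, though the {!geofilt} filter is standard Solr syntax (with d given in kilometers).

```python
import requests

# Query a hypothetical Solr core of dataset metadata for records within
# 100 km of a point, using Solr's standard geofilt spatial filter.
params = {
    "q": "*:*",
    "fq": "{!geofilt sfield=coordinates pt=52.38,13.06 d=100}",
    "wt": "json",
    "rows": 10,
}
resp = requests.get("http://localhost:8983/solr/doidb/select", params=params)
for doc in resp.json()["response"]["docs"]:
    print(doc.get("doi"), doc.get("title"))
```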
21 CFR 892.2050 - Picture archiving and communications system.
Code of Federal Regulations, 2014 CFR
2014-04-01
... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...
21 CFR 892.2050 - Picture archiving and communications system.
Code of Federal Regulations, 2011 CFR
2011-04-01
... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...
21 CFR 892.2050 - Picture archiving and communications system.
Code of Federal Regulations, 2012 CFR
2012-04-01
... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...
21 CFR 892.2050 - Picture archiving and communications system.
Code of Federal Regulations, 2013 CFR
2013-04-01
... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high-risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product and provides reliable forecasts of the end quality of the software being developed, in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
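The paper's network structure and quantifications are not reproduced here. As a minimal sketch of the idea, the toy network below uses two of the hypothesized driving factors with invented probabilities, and performs inference by brute-force enumeration rather than with the authors' tooling.

```python
from itertools import product

# Toy Bayesian network: TeamSkill and ProcessMaturity are independent
# root causes; Suitability depends on both. All probabilities are invented.
P_skill = {"high": 0.6, "low": 0.4}
P_maturity = {"high": 0.5, "low": 0.5}
P_suitable = {  # P(Suitability=good | skill, maturity)
    ("high", "high"): 0.9, ("high", "low"): 0.7,
    ("low", "high"): 0.6,  ("low", "low"): 0.3,
}

def p_good_given(skill=None):
    """P(Suitability=good), optionally conditioned on observed skill."""
    num = den = 0.0
    for s, m in product(P_skill, P_maturity):
        if skill is not None and s != skill:
            continue
        w = P_skill[s] * P_maturity[m]
        num += w * P_suitable[(s, m)]
        den += w
    return num / den

print(f"P(good) = {p_good_given():.3f}")            # 0.660
print(f"P(good | skill=low) = {p_good_given('low'):.3f}")  # 0.450
```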
75 FR 25185 - Broadband Initiatives Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
..., excluding desktop or laptop computers, computer hardware and software (including anti-virus, anti-spyware, and other security software), audio or video equipment, computer network components... 10 desktop or laptop computers and individual workstations to be located within the rural library...
"HIP" new software: The Hydroecological Integrity Assessment Process
Henriksen, Jim; Wilson, Juliette T.
2006-01-01
Researchers at the U.S. Geological Survey Fort Collins Science Center (FORT) have developed the Hydroecological Integrity Assessment Process (HIP) and a suite of software tools for conducting a hydrologic classification of streams, addressing instream flow needs, and assessing past and proposed hydrologic alterations on streamflow and other ecosystem components. The HIP recognizes that streamflow is strongly related to many critical physicochemical components of rivers, such as dissolved oxygen, channel geomorphology, and habitats. Streamflow is considered a “master variable” that limits the distribution, abundance, and diversity of many aquatic plant and animal species.
Abstracted Workflow Framework with a Structure from Motion Application
NASA Astrophysics Data System (ADS)
Rossi, Adam J.
In scientific and engineering disciplines, from academia to industry, there is an increasing need for the development of custom software to perform experiments, construct systems, and develop products. The natural mindset initially is to shortcut and bypass all overhead and process rigor in order to obtain an immediate result for the problem at hand, with the misconception that the software will simply be thrown away at the end. In a majority of cases, it turns out the software persists for many years and likely ends up in production systems for which it was not initially intended. In the current study, a framework that can be used in both industry and academic applications mitigates underlying problems associated with developing scientific and engineering software. This results in software that is much more maintainable, documented, and usable by others, specifically allowing new users to extend the capabilities of components already implemented in the framework. There is a multi-disciplinary need in the fields of imaging science, computer science, and software engineering for a unified implementation model, which motivates the development of an abstracted software framework. Structure from motion (SfM) has been identified as one use case where the abstracted workflow framework can improve research efficiency and eliminate implementation redundancies in scientific fields. The SfM process begins by obtaining 2D images of a scene from different perspectives. Features are extracted from the images and correspondences are established, providing sufficient information to initialize the problem for fully automated processing. Transformations are established between views, and 3D points are computed via triangulation algorithms. The camera-model parameters for all views are solved through bundle adjustment, establishing a highly consistent point cloud. The initial sparse point cloud and camera matrices are used to generate a dense point cloud through patch-based techniques or densification algorithms such as Semi-Global Matching (SGM). The point cloud can be visualized or exploited by both humans and automated techniques. In some cases the point cloud is "draped" with the original imagery in order to enhance the 3D model for a human viewer. The SfM workflow can be implemented in the abstracted framework, making it easy for multiple users to leverage and extend. Like many processes in scientific and engineering domains, the workflow described for SfM is complex and requires many disparate components to form a functional system, often utilizing algorithms implemented by many users in different languages and environments, without knowledge of how each component fits into the larger system. In practice, this generally leads to issues interfacing the components, building the software for the desired platforms, understanding its concept of operations, and manipulating it to fit the desired function for a particular application. In addition, other scientists and engineers instinctively wish to analyze the performance of the system, establish new algorithms, optimize existing processes, and establish new functionality based on current research. This requires a framework whereby new components can be easily plugged in without affecting the currently implemented functionality. The need for a universal programming environment establishes the motivation for the development of the abstracted workflow framework.
This software implementation, named Catena, provides base classes from which new components must derive in order to operate within the framework. Derivation mandates that certain requirements be satisfied in order to provide a complete implementation. Additionally, the developer must provide documentation of the component in terms of its overall function and inputs. The input and output values of the component's interface must be defined in terms of their respective data types, and the implementation uses mechanisms within the framework to retrieve and send those values. This process requires developers to componentize their algorithms rather than implement them monolithically. Although the demands on the developer are slightly greater, the benefits realized from using Catena far outweigh the overhead and result in extensible software. This thesis provides a basis for the abstracted workflow framework concept and the Catena software implementation. The benefits are illustrated through a detailed examination of the SfM process as an example application.
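Catena's actual base classes are not shown in the abstract. As a hedged sketch of the pattern it describes (components deriving from a base class, declaring typed inputs and outputs, and being wired together by a driver), with all names invented for illustration:

```python
# Sketch of an abstracted-workflow component model: each component
# declares typed inputs and outputs and implements run(); a trivial
# driver wires the outputs of one stage to the inputs of the next.
class Component:
    inputs = {}    # name -> type
    outputs = {}   # name -> type

    def run(self, **kwargs):
        raise NotImplementedError

class FeatureExtractor(Component):
    inputs = {"images": list}
    outputs = {"features": dict}

    def run(self, images):
        # Stand-in for real feature extraction (e.g., keypoint detection).
        return {"features": {img: f"features({img})" for img in images}}

class Matcher(Component):
    inputs = {"features": dict}
    outputs = {"matches": list}

    def run(self, features):
        imgs = sorted(features)
        return {"matches": [(a, b) for a, b in zip(imgs, imgs[1:])]}

def run_pipeline(stages, data):
    for stage in stages:
        args = {k: data[k] for k in stage.inputs}
        data.update(stage.run(**args))
    return data

result = run_pipeline([FeatureExtractor(), Matcher()],
                      {"images": ["a.png", "b.png", "c.png"]})
print(result["matches"])  # [('a.png', 'b.png'), ('b.png', 'c.png')]
```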
Baad-Hansen, Thomas; Kold, Søren; Kaptein, Bart L; Søballe, Kjeld
2007-08-01
In RSA, tantalum markers attached to metal-backed acetabular cups are often difficult to detect on stereo radiographs due to the high density of the metal shell. This results in occlusion of the prosthesis markers and may lead to inconclusive migration results. Within the last few years, new software systems have been developed to solve this problem. We compared the precision of 3 RSA systems in migration analysis of the acetabular component. A hemispherical and a non-hemispherical acetabular component were mounted in a phantom. Both acetabular components underwent migration analyses with 3 different RSA systems: conventional RSA using tantalum markers, an RSA system using a hemispherical cup algorithm, and a novel model-based RSA system. We found narrow confidence intervals, indicating high precision of the conventional marker system and model-based RSA with regard to migration and rotation. The confidence intervals of conventional RSA and model-based RSA were narrower than those of the hemispherical cup algorithm-based system regarding cup migration and rotation. The model-based RSA software combines the precision of the conventional RSA software with the convenience of the hemispherical cup algorithm-based system. Based on our findings, we believe that these new tools offer an improvement in the measurement of acetabular component migration.
Probabilistic Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
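As a hedged sketch of the response-surface step described above: the quadratic surrogate, the synthetic "FEA" function, and all numbers below are stand-ins chosen for illustration, not PRODAF's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an expensive FEA run: peak stress as a function of a
# geometric dimension x1 and a load factor x2 (entirely synthetic).
def fea_stress(x1, x2):
    return 200.0 + 50.0 * x1 + 30.0 * x2 + 15.0 * x1 * x2

# 1) Fit a quadratic response surface from a handful of "FEA" samples.
X = rng.uniform(-1, 1, size=(30, 2))
y = fea_stress(X[:, 0], X[:, 1])
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# 2) Monte Carlo on the cheap surrogate: probability stress exceeds a limit.
samples = rng.normal(0.0, 0.3, size=(100_000, 2))
s1, s2 = samples[:, 0], samples[:, 1]
As = np.column_stack([np.ones(len(samples)), s1, s2, s1**2, s2**2, s1 * s2])
stresses = As @ coef
print(f"P(stress > 230) ~ {np.mean(stresses > 230):.4f}")
```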
CEMENTITIOUS BARRIERS PARTNERSHIP FY13 MID-YEAR REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burns, H.; Flach, G.; Langton, C.
2013-05-01
In FY2013, the Cementitious Barriers Partnership (CBP) is continuing its effort to develop and enhance software tools, demonstrating tangible progress toward its objective: a set of tools that improve understanding and prediction of the long-term structural, hydraulic, and chemical performance of cementitious barriers used in nuclear applications. In FY2012, the CBP released the initial in-house beta version of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. The current primary software components are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. THAMES is a planned future CBP Toolbox component (FY13/14) focused on simulating the microstructure of cementitious materials and calculating the resultant hydraulic and constituent mass-transfer parameters needed in modeling. This past November, CBP Software Toolbox Version 1.0 was released, supporting analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. The LeachXS component embodies an extensive material-property measurements database along with chemical speciation and reactive mass transport simulation cases, with emphasis on leaching of major, trace, and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride and sulfate). The CBP issued numerous reports and other documentation accompanying the Version 1.0 release, including a CBP Software Toolbox User Guide and Installation Guide. These documents, as well as the presentations from the CBP Software Toolbox Demonstration and User Workshop, which is briefly described below, can be accessed from the CBP webpage at http://cementbarriers.org/. The website was recently modified to describe the CBP Software Toolbox and includes an interest form for applying to use the software. The CBP FY13 program is continuing research to improve and enhance the simulation tools, and to develop new tools that model other key degradation phenomena not addressed in Version 1.0. Efforts to verify the various simulation tools through laboratory experiments and analysis of field specimens are also ongoing, to quantify and reduce the uncertainty associated with performance assessments. This mid-year report summarizes the FY13 software accomplishments, including the release of Version 1.0 of the CBP Software Toolbox, and the various experimental programs that are providing data for calibration and validation of the CBP-developed software. The focus this year for experimental studies was measuring transport in cementitious material using a leaching method and measuring the reduction capacity of saltstone field samples. Results are being used to calibrate and validate the updated carbonation model.
ControlShell - A real-time software framework
NASA Technical Reports Server (NTRS)
Schneider, Stanley A.; Ullman, Marc A.; Chen, Vincent W.
1991-01-01
ControlShell is designed to enable modular design and implementation of real-time software. It is an object-oriented tool-set for real-time software system programming. It provides a series of execution and data interchange mechanisms that form a framework for building real-time applications. These mechanisms allow a component-based approach to real-time software generation and management. By defining a set of interface specifications for intermodule interaction, ControlShell provides a common platform that is the basis for real-time code development and exchange.
Software Framework for Peer Data-Management Services
NASA Technical Reports Server (NTRS)
Hughes, John; Hardman, Sean; Crichton, Daniel; Hyon, Jason; Kelly, Sean; Tran, Thuy
2007-01-01
Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.
Retina Image Screening and Analysis Software Version 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Aykac, Deniz
2009-04-01
The software allows physicians or researchers to ground-truth images of retinas, identifying key physiological features and lesions that are indicative of disease. The software features methods to automatically detect the physiological features and lesions. The software contains code to measure the quality of images received from a telemedicine network; create and populate a database for a telemedicine network; review and report the diagnosis of a set of images; and also contains components to transmit images from a Zeiss camera to the network through SFTP.
Image watermarking against lens flare effects
NASA Astrophysics Data System (ADS)
Chotikawanid, Piyanart; Amornraksa, Thumrongrat
2017-02-01
Lens flare effects in various photo and camera software can partially or fully damage the watermark information within a watermarked image. In this paper we propose a spatial-domain image watermarking method robust against lens flare effects. The watermark embedding is based on modification of the saturation color component in the HSV color space of a host image. For watermark extraction, a homomorphic filter is used to predict the original embedding component from the watermarked component, and the watermark is blindly recovered by differencing the two components. The watermarked image's quality is evaluated by wPSNR, while the extracted watermark's accuracy is evaluated by NC. Experimental results against various types of lens flare effects, from both computer software and mobile applications, showed that our proposed method outperformed previous methods.
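As a hedged sketch of the embedding idea (additive modification of the saturation channel): the strength parameter and the blur-based estimate of the original component below are assumptions made for illustration, standing in for the paper's homomorphic filtering.

```python
import numpy as np
import cv2

rng = np.random.default_rng(1)

img = cv2.imread("host.png")                     # assumes a BGR host image exists
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV).astype(np.float32)
s = hsv[:, :, 1]

# Embed: add a +/- alpha pseudo-random pattern to the saturation channel.
alpha = 4.0
w = rng.choice([-1.0, 1.0], size=s.shape)        # watermark as a bipolar pattern
hsv[:, :, 1] = np.clip(s + alpha * w, 0, 255)
marked = cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
cv2.imwrite("marked.png", marked)

# Blind extraction: estimate the unmarked saturation with a low-pass blur
# (a simple stand-in for homomorphic filtering), subtract, and correlate
# the residual with the known pattern.
s_marked = cv2.cvtColor(marked, cv2.COLOR_BGR2HSV)[:, :, 1].astype(np.float32)
s_est = cv2.GaussianBlur(s_marked, (7, 7), 0)
residual = s_marked - s_est
nc = np.sum(residual * w) / np.sqrt(np.sum(residual**2) * np.sum(w**2))
print(f"normalized correlation with embedded pattern: {nc:.3f}")
```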
NASA Technical Reports Server (NTRS)
Taylor, Nancy L.; Randall, Donald P.; Bowen, John T.; Johnson, Mary M.; Roland, Vincent R.; Matthews, Christine G.; Gates, Raymond L.; Skeens, Kristi M.; Nolf, Scott R.; Hammond, Dana P.
1990-01-01
The computer graphics capabilities available at the Center are introduced and their use is explained. More specifically, the manual identifies and describes the various graphics software and hardware components, details the interfaces between these components, and provides information concerning the use of these components at LaRC.
Current state of the mass storage system reference model
NASA Technical Reports Server (NTRS)
Coyne, Robert
1993-01-01
IEEE SSSWG was chartered in May 1990 to abstract the hardware and software components of existing and emerging storage systems and to define the software interfaces between these components. The immediate goal is the decomposition of a storage system into interoperable functional modules which vendors can offer as separate commercial products. The ultimate goal is to develop interoperable standards which define the software interfaces, and in the distributed case, the associated protocols to each of the architectural modules in the model. The topics are presented in viewgraph form and include the following: IEEE SSSWG organization; IEEE SSSWG subcommittees & chairs; IEEE standards activity board; layered view of the reference model; layered access to storage services; IEEE SSSWG emphasis; and features for MSSRM version 5.
MSAViewer: interactive JavaScript visualization of multiple sequence alignments.
Yachdav, Guy; Wilzbach, Sebastian; Rauscher, Benedikt; Sheridan, Robert; Sillitoe, Ian; Procter, James; Lewis, Suzanna E; Rost, Burkhard; Goldberg, Tatyana
2016-11-15
The MSAViewer is a quick and easy visualization and analysis JavaScript component for Multiple Sequence Alignment data of any size. Core features include interactive navigation through the alignment, application of popular color schemes, sorting, selecting and filtering. The MSAViewer is 'web ready': written entirely in JavaScript, compatible with modern web browsers and does not require any specialized software. The MSAViewer is part of the BioJS collection of components. The MSAViewer is released as open source software under the Boost Software License 1.0. Documentation, source code and the viewer are available at http://msa.biojs.net/. Supplementary data are available at Bioinformatics online. Contact: msa@bio.sh. © The Author 2016. Published by Oxford University Press.
NASA Technical Reports Server (NTRS)
Johnson, Charles S.
1986-01-01
It is nearly axiomatic that to take the greatest advantage of the useful features available in a development system, and to avoid the negative interactions of those features, requires the exercise of a design methodology which constrains their use. A major design support feature of the Ada language is abstraction: for data, functions, processes, resources, and system elements in general. Atomic abstract types can be created in packages defining those private types and all of the overloaded operators, functions, and hidden data required for their use in an application. Generically structured abstract types can be created in generic packages defining those structured private types as buildups from the user-defined data types that are input as parameters. A study is made of the design constraints required for software incorporating either atomic or generically structured abstract types, if the integration of software components based on them is to be subsequently performed. The impact of these techniques on the reusability of software and the creation of project-specific software support environments is also discussed.
CARES/Life Software for Designing More Reliable Ceramic Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.
1997-01-01
Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and slow-crack-growth (SCG, or fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
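As a hedged illustration of the probabilistic design idea, the sketch below evaluates a two-parameter Weibull weakest-link model with invented parameters; CARES/Life's actual computation integrates over a finite element stress field and includes time-dependent slow-crack-growth terms not shown here.

```python
import numpy as np

# Two-parameter Weibull weakest-link model for a brittle component:
#   P_f = 1 - exp(-(V / V0) * (sigma / sigma0)^m)
# m: Weibull modulus; sigma0: characteristic strength of reference volume V0.
m, sigma0, V0 = 10.0, 400.0, 1.0   # invented specimen-derived parameters

def failure_probability(sigma, V):
    return 1.0 - np.exp(-(V / V0) * (sigma / sigma0) ** m)

# Failure probability rises steeply with stress for high Weibull modulus.
for sigma in (250.0, 300.0, 350.0):
    print(f"sigma = {sigma:5.1f} MPa -> P_f = {failure_probability(sigma, V=2.0):.4f}")
```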
Universal distribution of component frequencies in biological and technological systems
Pang, Tin Yau; Maslov, Sergei
2013-01-01
Bacterial genomes and large-scale computer software projects both consist of a large number of components (genes or software packages) connected via a network of mutual dependencies. Components can be easily added or removed from individual systems, and their use frequencies vary over many orders of magnitude. We study this frequency distribution in genomes of ∼500 bacterial species and in over 2 million Linux computers and find that in both cases it is described by the same scale-free power-law distribution with an additional peak near the tail of the distribution corresponding to nearly universal components. We argue that the existence of a power law distribution of frequencies of components is a general property of any modular system with a multilayered dependency network. We demonstrate that the frequency of a component is positively correlated with its dependency degree given by the total number of upstream components whose operation directly or indirectly depends on the selected component. The observed frequency/dependency degree distributions are reproduced in a simple mathematically tractable model introduced and analyzed in this study. PMID:23530195
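As a hedged sketch of how such a frequency distribution can be checked, the snippet below applies the standard continuous maximum-likelihood estimator for a power-law exponent (Clauset, Shalizi and Newman, 2009) to synthetic data; the paper's own fitting procedure may differ.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic component "use frequencies" drawn from a power law with
# exponent alpha = 2.5 above x_min, via inverse-transform sampling.
alpha_true, x_min, n = 2.5, 1.0, 50_000
u = rng.uniform(size=n)
x = x_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

# Continuous MLE for the exponent:
#   alpha_hat = 1 + n / sum(ln(x_i / x_min))
alpha_hat = 1.0 + len(x) / np.sum(np.log(x / x_min))
print(f"true alpha = {alpha_true}, estimated alpha = {alpha_hat:.3f}")
```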
Securing Ground Data System Applications for Space Operations
NASA Technical Reports Server (NTRS)
Pajevski, Michael J.; Tso, Kam S.; Johnson, Bryan
2014-01-01
The increasing prevalence and sophistication of cyber attacks has prompted the Multimission Ground Systems and Services (MGSS) Program Office at Jet Propulsion Laboratory (JPL) to initiate the Common Access Manager (CAM) effort to protect software applications used in Ground Data Systems (GDSs) at JPL and other NASA Centers. The CAM software provides centralized services and software components used by GDS subsystems to meet access control requirements and ensure data integrity, confidentiality, and availability. In this paper we describe the CAM software; examples of its integration with spacecraft commanding software applications and an information management service; and measurements of its performance and reliability.
Cyber Warfare: Protecting Military Systems
2000-01-01
Software is a key component in nearly every critical system used by the Department of Defense. Attacking the software in a system, known as cyber warfare, is a revolutionary method of pursuing war. This article describes various cyber warfare approaches and suggests methods to counter them.
The Vendors' Corner: Biblio-Techniques' Library and Information System (BLIS).
ERIC Educational Resources Information Center
Library Software Review, 1984
1984-01-01
Describes online catalog and integrated library computer system designed to enhance Washington Library Network's software. Highlights include system components; implementation options; system features (integrated library functions, database design, system management facilities); support services (installation and training, software maintenance and…
How to Purchase, Set Up, & Safeguard a CD-ROM Network.
ERIC Educational Resources Information Center
Almquist, Arne J.
1996-01-01
Presents an overview of the hardware and software required to network CD-ROMs in schools. Topics include network infrastructures, networking software, file server-based systems, CD-ROM servers, vendors of network components, workstations, network utilities, and network management. (LRW)
Educational Software: Evaluation? No! Utility? Yes!
ERIC Educational Resources Information Center
Hofmann, Rich
1985-01-01
The utility of educational software is associated with three components: (1) type of learning (cognitive development, individually facilitated acquisition of knowledge, teacher-facilitated acquisition of knowledge, memorization); (2) type of experience (self-directed or purposive environmental experience); and (3) user perception of software…
Advanced software development workstation project: Engineering scripting language. Graphical editor
NASA Technical Reports Server (NTRS)
1992-01-01
Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. Cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software part composition system.
Simple solution to the medical instrumentation software problem
NASA Astrophysics Data System (ADS)
Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.
1995-04-01
Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to good manufacturing practices (GMP). Good manufacturing practices, as specified by the FDA and ISO, require the definition of, and compliance with, a software process that ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides a very broad range of functionalities, from embedded real-time to management information systems, as required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.
Development of a web application for water resources based on open source software
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.
2014-01-01
This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software, and web GIS. The web application has three web services: (1) managing, presenting, and storing geospatial data, (2) supporting water resources modeling, and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, jQuery), and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming-language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested in a case study with concurrent multi-user access.
The regulatory software of cellular metabolism.
Segrè, Daniel
2004-06-01
Understanding the regulation of metabolic pathways in the cell is like unraveling the 'software' that is running on the 'hardware' of the metabolic network. Transcriptional regulation of enzymes is an important component of this software. A recent systematic analysis of metabolic gene-expression data in Saccharomyces cerevisiae reveals a complex modular organization of co-expressed genes, which could increase our ability to understand and engineer cellular metabolic functions.
Program Helps Standardize Documentation Of Software
NASA Technical Reports Server (NTRS)
Howe, G.
1994-01-01
Intelligent Documentation Management System, IDMS, computer program developed to assist project managers in implementing information system documentation standard known as NASA-STD-2100-91, NASA STD, COS-10300, of NASA's Software Management and Assurance Program. Standard consists of data-item descriptions or templates, each of which governs particular component of software documentation. IDMS helps program manager in tailoring documentation standard to project. Written in C language.
The SoRReL papers: Recent publications of the Software Reuse Repository Lab
NASA Technical Reports Server (NTRS)
Eichmann, David A. (Editor)
1992-01-01
Presented here, in their entirety, are some of the papers recently published by the Software Reuse Repository Lab (SoRReL). Some typical titles are as follows: Design of a Lattice-Based Faceted Classification System; A Hybrid Approach to Software Reuse Repository Retrieval; Selecting Reusable Components Using Algebraic Specifications; Neural Network-Based Retrieval from Reuse Repositories; and A Neural Net-Based Approach to Software Metrics.
Coordination in Large Scale Software Development
1990-01-01
toward achieving common and explicitly recognized goals" (Blau and Scott, 1962) and "the integration or linking together of different parts of an...require a strong degree of integration of its components. Much software is built of thousands of modules that must mesh with each other perfectly for the...coordination between subgroups producing software modules could lead to failure in integrating the modules themselves. Informal communication. Both
Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.
1996-01-01
The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment, and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.
Designing Tracking Software for Image-Guided Surgery Applications: IGSTK Experience
Enquobahrie, Andinet; Gobbi, David; Turek, Matt; Cheng, Patrick; Yaniv, Ziv; Lindseth, Frank; Cleary, Kevin
2009-01-01
Objective Many image-guided surgery applications require tracking devices as part of their core functionality. The Image-Guided Surgery Toolkit (IGSTK) was designed and developed to interface tracking devices with software applications incorporating medical images. Methods IGSTK was designed as an open source C++ library that provides the basic components needed for fast prototyping and development of image-guided surgery applications. This library follows a component-based architecture with several components designed for specific sets of image-guided surgery functions. At the core of the toolkit is the tracker component that handles communication between a control computer and navigation device to gather pose measurements of surgical instruments present in the surgical scene. The representations of the tracked instruments are superimposed on anatomical images to provide visual feedback to the clinician during surgical procedures. Results The initial version of the IGSTK toolkit has been released in the public domain and several trackers are supported. The toolkit and related information are available at www.igstk.org. Conclusion With the increased popularity of minimally invasive procedures in health care, several tracking devices have been developed for medical applications. Designing and implementing high-quality and safe software to handle these different types of trackers in a common framework is a challenging task. It requires establishing key software design principles that emphasize abstraction, extensibility, reusability, fault-tolerance, and portability. IGSTK is an open source library that satisfies these needs for the image-guided surgery community. PMID:20037671
NASA Tech Briefs, August 1997. Volume 21, No. 8
NASA Technical Reports Server (NTRS)
1997-01-01
Topics: Graphics and Simulation; Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.
NASA Technical Reports Server (NTRS)
Grubb, Matt
2016-01-01
The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools to aid in areas such as software development, integration test (IT), mission operations training, verification and validation (V&V), and software systems checkout. NOS3 provides a software development environment, a multi-target build system, an operator interface/ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components, as well as limited funding for engineering test units (ETU). Considering the difficulty of providing a hardware test-bed to each developer and tester, hardware components are modeled based upon characteristic data or manufacturers' data sheets for each individual component. The fidelity of each hardware model is such that FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (the 42 spacecraft simulator) is used to provide the necessary data. The underlying infrastructure used to transfer messages between FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven to be beneficial for V&V of larger missions such as the James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any number of combinations of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace. For testing, plug-ins are implemented in COSMOS to control the NOS3 simulations, while the command and telemetry tools available in COSMOS are used to communicate with FSW. NOS3 is actively being used for FSW development and component testing of the Simulation-to-Flight 1 (STF-1) CubeSat. As NOS3 matures, hardware models have been added for common CubeSat components such as NovAtel GPS receivers, Clyde Space electrical power systems and batteries, ISISpace antenna systems, etc. In the future, NASA IV&V plans to distribute NOS3 to other CubeSat developers and release the suite to the open-source community.
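As a hedged sketch of the software-based hardware-model idea: the SimGPS class and its command strings are made up for illustration, and NOS3's actual middleware, message formats, and 42-based data feeds are not shown in the abstract.

```python
# Toy software-based hardware model: flight software sends a command
# string over a message bus (a plain function call here), and the model
# replies as the physical device would, so FSW can run unmodified.
class SimGPS:
    """Stand-in GPS receiver model; positions would normally come from
    an environment simulation such as 42 rather than a fixed table."""

    def __init__(self, ephemeris):
        self.ephemeris = ephemeris   # list of (lat, lon, alt_km) samples
        self.t = 0

    def handle(self, command):
        if command == "GPS:GETPOS":
            lat, lon, alt = self.ephemeris[self.t % len(self.ephemeris)]
            self.t += 1
            return f"POS {lat:.4f} {lon:.4f} {alt:.1f}"
        return "ERR unknown command"

gps = SimGPS([(39.65, -79.97, 410.0), (39.70, -79.90, 410.2)])
print(gps.handle("GPS:GETPOS"))   # flight software sees a device-like reply
print(gps.handle("GPS:GETPOS"))
```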
Electronic Health Record for Intensive Care based on Usual Windows Based Software.
Reper, Arnaud; Reper, Pascal
2015-08-01
In intensive care units, the amount of data to be processed for patient care, the turnover of patients, and the necessity for reliability and for review processes indicate the use of patient data management systems (PDMS) and electronic health records (EHR). To respond to the needs of an intensive care unit without being locked into proprietary software, we developed an EHR based on usual software and components. The software was designed as a client-server architecture running on the Windows operating system and powered by the Access database system. The client software was developed using the Visual Basic interface library. The application offers users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and the possibility to encode medical activities for billing processes. Since its deployment in September 2004, the EHR has been used in the care of more than five thousand patients, with the expected software reliability, and has facilitated data management and review processes. Communication with other medical software was not developed from the start, and is realized through the basic functionality of a communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on usual software components, was able to respond to the medical needs of the local ICU environment. The use of Windows for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.
DOT National Transportation Integrated Search
2013-05-01
This document describes the Software Architecture Design and Implementation Options for FRATIS system. The demonstration component of this task will serve to test the technical feasibility of the FRATIS prototype while also facilitating the collectio...
Accounting for Uncertainties in Strengths of SiC MEMS Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel; Evans, Laura; Beheim, Glen; Trapp, Mark; Jadaan, Osama; Sharpe, William N., Jr.
2007-01-01
A methodology has been devised for accounting for uncertainties in the strengths of silicon carbide structural components of microelectromechanical systems (MEMS). The methodology enables prediction of the probabilistic strengths of complexly shaped MEMS parts using data from tests of simple specimens. This methodology is intended to serve as part of a rational basis for designing SiC MEMS, supplementing methodologies that have been borrowed from the art of designing macroscopic brittle material structures. The need for this or a similar methodology arises as a consequence of the fundamental nature of MEMS and the brittle silicon-based materials of which they are typically fabricated. When tested to fracture, MEMS and structural components thereof show wide part-to-part scatter in strength. The methodology involves the use of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) software in conjunction with the ANSYS Probabilistic Design System (PDS) software to simulate or predict the strength responses of brittle material components while simultaneously accounting for the effects of variability of geometrical features on those strength responses. As such, the methodology involves the use of an extended version of the ANSYS/CARES/PDS software system described in Probabilistic Prediction of Lifetimes of Ceramic Parts (LEW-17682-1/4-1), Software Tech Briefs supplement to NASA Tech Briefs, Vol. 30, No. 9 (September 2006), page 10. The ANSYS PDS software enables the ANSYS finite-element-analysis program to account for uncertainty in the design-and-analysis process: it accounts for uncertainty in material properties, dimensions, and loading by assigning probabilistic distributions to user-specified model parameters and performing simulations using various sampling techniques.
Automated reuseable components system study results
NASA Technical Reports Server (NTRS)
Gilroy, Kathy
1989-01-01
The Automated Reusable Components System (ARCS) was developed under a Phase 1 Small Business Innovative Research (SBIR) contract for the U.S. Army CECOM. The objectives of the ARCS program were: (1) to investigate issues associated with automated reuse of software components, identify alternative approaches, and select promising technologies, and (2) to develop tools that support component classification and retrieval. The approach followed was to research emerging techniques and experimental applications associated with reusable software libraries, to investigate the more mature information retrieval technologies for applicability, and to investigate the applicability of specialized technologies to improve the effectiveness of a reusable component library. Various classification schemes and retrieval techniques were identified and evaluated for potential application in an automated library system for reusable components. Strategies for library organization and management, component submittal and storage, and component search and retrieval were developed. A prototype ARCS was built to demonstrate the feasibility of automating the reuse process. The prototype was created using a subset of the classification and retrieval techniques that were investigated. The demonstration system was exercised and evaluated using reusable Ada components selected from the public domain. A requirements specification for a production-quality ARCS was also developed.
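As a hedged sketch of faceted classification and retrieval: the facets, catalog entries, and match-count scoring below are invented for illustration; the abstract describes ARCS's schemes only at a high level.

```python
# Toy faceted classification: each reusable component is described by
# facet values; retrieval ranks components by how many query facets match.
CATALOG = [
    {"name": "queue_pkg", "function": "store", "object": "queue", "language": "Ada"},
    {"name": "stack_pkg", "function": "store", "object": "stack", "language": "Ada"},
    {"name": "sort_pkg",  "function": "order", "object": "array", "language": "Ada"},
]

def retrieve(query):
    """Rank catalog entries by number of matching facet values."""
    def score(entry):
        return sum(1 for k, v in query.items() if entry.get(k) == v)
    ranked = sorted(CATALOG, key=score, reverse=True)
    return [(e["name"], score(e)) for e in ranked if score(e) > 0]

print(retrieve({"function": "store", "object": "queue"}))
# [('queue_pkg', 2), ('stack_pkg', 1)]
```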
Technique for Early Reliability Prediction of Software Components Using Behaviour Models
Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad
2016-01-01
Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
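As a hedged sketch of reliability computation over a component dependency graph, the snippet below composes component reliabilities weighted by transition probabilities on a made-up acyclic graph; the paper's stack-based CPDG algorithm additionally handles loop entry and exit points, which this sketch omits.

```python
# Toy component probabilistic dependency graph: nodes carry component
# reliabilities, edges carry transition probabilities. System reliability
# is the probability-weighted product of reliabilities along paths from
# the entry node to the exit node (acyclic case only).
RELIABILITY = {"A": 0.99, "B": 0.97, "C": 0.95, "D": 0.98}
EDGES = {                      # node -> [(next_node, transition_prob)]
    "A": [("B", 0.6), ("C", 0.4)],
    "B": [("D", 1.0)],
    "C": [("D", 1.0)],
    "D": [],
}

def system_reliability(node):
    r = RELIABILITY[node]
    if not EDGES[node]:
        return r
    return r * sum(p * system_reliability(nxt) for nxt, p in EDGES[node])

print(f"estimated system reliability: {system_reliability('A'):.4f}")  # ~0.9334
```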
2006-11-01
engines will involve a family of common components. It will consist of a real-time operating system and partitioned application software (AS...system will employ a standard hardware and software architecture. It will consist of a real-time operating system and partitioned application...Inputs - Enables Large Cost Reduction 3. Software - FAA Certified Auto Code - Real-Time Operating System - Commercial
Data and Analysis Center for Software.
1980-06-01
can make use of it in their day-to-day activities of developing, maintaining, and managing software. The bibliographic collection is composed of...which refer to development, design, or programming approaches which view a software system component, or module, in terms of its required or intended... practices" are also included in this group. PROCEDURES (1 keyword) Procedures is a term used ambiguously in the literature to refer to functions
NASA Tech Briefs, October 1997. Volume 21, No. 10
NASA Technical Reports Server (NTRS)
1997-01-01
Topics covered include: Sensors/Imaging; Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
Color image watermarking against fog effects
NASA Astrophysics Data System (ADS)
Chotikawanid, Piyanart; Amornraksa, Thumrongrat
2017-07-01
Fog effects in various computer and camera software can partially or fully damage the watermark information within the watermarked image. In this paper, we propose a color image watermarking scheme based on modification of the reflectance component to resist fog effects. The reflectance component is extracted from the blue channel in the RGB color space of a host image and then used to carry a watermark signal. The watermark is extracted blindly by subtracting an estimate of the original reflectance component from the watermarked component. The performance of the proposed watermarking method in terms of wPSNR and NC is evaluated and compared with the previous method. Experimental results on robustness against various levels of fog effect, from both computer software and a mobile application, demonstrated higher robustness for the proposed method than for the previous one.
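The general idea can be illustrated with a crude sketch. This is not the authors' method: here the illumination of the blue channel is estimated with a box blur, the residual is treated as the reflectance component, the watermark rides additively in that component, and blind extraction re-estimates the reflectance. The blur sizes, embedding strength, and estimator choices are all assumptions.

import numpy as np

def box_blur(channel, k=15):
    # Separable box filter as a crude low-pass illumination estimate.
    pad = k // 2
    padded = np.pad(channel, pad, mode="edge")
    kernel = np.ones(k) / k
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, "valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, "valid"), 0, rows)

def embed(image, watermark, alpha=8.0):
    blue = image[..., 2].astype(float)
    L = box_blur(blue)                     # illumination estimate
    R = blue - L                           # "reflectance" residual
    out = image.astype(float).copy()
    out[..., 2] = np.clip(L + R + alpha * watermark, 0, 255)
    return out.astype(np.uint8)

def extract(marked, alpha=8.0):
    blue = marked[..., 2].astype(float)
    Rw = blue - box_blur(blue)             # watermarked reflectance (blind)
    R_est = box_blur(Rw, k=5)              # rough estimate of original reflectance
    return (Rw - R_est) / alpha            # per-pixel watermark estimate

rng = np.random.default_rng(0)
img = np.zeros((64, 64, 3), dtype=np.uint8)
img[..., 2] = np.linspace(0, 255, 64, dtype=np.uint8)[None, :]  # smooth host
wm = rng.choice([-1.0, 1.0], size=(64, 64))
agreement = np.mean(np.sign(extract(embed(img, wm))) == wm)
print(f"sign agreement: {agreement:.2f}")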
NASA Technical Reports Server (NTRS)
1997-01-01
Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software developed at the NASA Lewis Research Center eases this task by providing a tool that uses probabilistic reliability analysis techniques to optimize the design and manufacture of brittle material components. CARES/Life is an integrated package that predicts the probability of a monolithic ceramic component's failure as a function of its time in service. It couples commercial finite element programs--which resolve a component's temperature and stress distribution--with reliability evaluation and fracture mechanics routines for modeling strength-limiting defects. These routines are based on calculations of the probabilistic nature of the brittle material's strength.
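The probabilistic core of such a tool can be illustrated with the standard two-parameter Weibull weakest-link form. This is a generic sketch, not the CARES/Life implementation, and the Weibull parameters and element data are invented.

import math

def element_failure_prob(stress, volume, sigma_0, m):
    # Weibull weakest-link form: Pf = 1 - exp(-V * (sigma/sigma_0)^m)
    return 1.0 - math.exp(-volume * (stress / sigma_0) ** m)

def component_failure_prob(elements, sigma_0=400.0, m=10.0):
    # Weakest-link assumption: the component survives only if every
    # finite element survives.
    survival = 1.0
    for stress, volume in elements:
        survival *= 1.0 - element_failure_prob(stress, volume, sigma_0, m)
    return 1.0 - survival

# Per-element (stress in MPa, volume in mm^3) from a finite element solution.
elements = [(180.0, 2.0), (240.0, 1.5), (310.0, 0.8)]
print(f"P(failure) = {component_failure_prob(elements):.4f}")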
RT-Syn: A real-time software system generator
NASA Technical Reports Server (NTRS)
Setliff, Dorothy E.
1992-01-01
This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs, not only by reducing development time but, more importantly, by facilitating maintenance, modification, and extension of complex mission-critical software systems, activities which currently dominate life-cycle costs.
Impeller leakage flow modeling for mechanical vibration control
NASA Technical Reports Server (NTRS)
Palazzolo, Alan B.
1996-01-01
HPOTP and HPFTP vibration test results have exhibited transient and steady characteristics which may be due to impeller leakage path (ILP) related forces. For example, an axial shift in the rotor could suddenly change the ILP clearances and lengths, yielding dynamic coefficient and subsequent vibration changes. ILP models are more complicated than conventional single-component annular seal models due to their radial flow component (Coriolis and centrifugal acceleration), complex geometry (axial/radial clearance coupling), internal boundary (transition) flow conditions between mechanical components along the ILP, and longer length, requiring moment as well as force coefficients. Flow coupling between mechanical components results from mass and energy conservation applied at their interfaces. Typical components along the ILP include an inlet seal, curved shroud, and an exit seal, which may be a stepped labyrinth type. Von Pragenau (MSFC) has modeled labyrinth seals as a series of plain annular seals for leakage and dynamic coefficient prediction. These multi-tooth components increase the total number of 'flow coupled' components in the ILP. Childs developed an analysis for an ILP consisting of a single, constant clearance shroud with an exit seal represented by a lumped flow-loss coefficient. This same geometry was later extended to include compressible flow. The objectives of the current work are to: supply ILP leakage-force impedance-dynamic coefficient modeling software to MSFC engineers, based on incompressible/compressible bulk flow theory; design the software to model a generic geometry ILP described by a series of components lying along an arbitrarily directed path; validate the software by comparison to available test data, CFD and bulk models; and develop a hybrid CFD-bulk flow model of an ILP to improve modeling accuracy within practical run time constraints.
ERIC Educational Resources Information Center
Srinivasan, Srilekha; Perez, Lance C.; Palmer, Robert D.; Brooks, David W.; Wilson, Kathleen; Fowler, David
2006-01-01
A systematic study of the implementation of simulation hardware (TIMS) replacing software (MATLAB) was undertaken for advanced undergraduate and early graduate courses in electrical engineering. One outcome of the qualitative component of the study was remarkable: most students interviewed (4/4 and 6/9) perceived the software simulations as…
Programming Language CAMIL II: Implementation and Evaluation.
ERIC Educational Resources Information Center
Gardner, Edward
A reimplementation of Computer assisted/managed instruction language (CAMIL) for qualitative and quantitative improvements in the software is presented. The reformatted language is described narratively, and major components of the system software are indicated and discussed. Authoring aids and imbedded support facilities are also described, and…
2002-06-01
techniques for addressing the software component retrieval problem. Steigerwald [Ste91] introduced the use of algebraic specifications for defining the...provided in terms of a specification written using Luqi’s Prototype Specification Description Language (PSDL) [LBY88] augmented with an algebraic
An efficient approach to the deployment of complex open source information systems
Cong, Truong Van Chi; Groeneveld, Eildert
2011-01-01
Complex open source information systems are usually implemented as component-based software to inherit the available functionality of existing software packages developed by third parties. Consequently, the deployment of these systems not only requires the installation of the operating system and application framework and the configuration of services, but also needs to resolve the dependencies among components. The problem becomes more challenging when the application must be installed and used on different platforms such as Linux and Windows. To address this, an efficient approach using virtualization technology is suggested and discussed in this paper. The approach has been applied in our project to deploy a web-based integrated information system in molecular genetics labs. It is a low-cost solution that benefits both software developers and end-users. PMID:22102770
The Development of Point Doppler Velocimeter Data Acquisition and Processing Software
NASA Technical Reports Server (NTRS)
Cavone, Angelo A.
2008-01-01
In order to develop efficient and quiet aircraft and validate Computational Fluid Dynamics predictions, aerodynamic researchers require flow parameter measurements to characterize flow fields about wind tunnel models and jet flows. A one-component Point Doppler Velocimeter (pDv), a non-intrusive, laser-based instrument, was constructed using a design/develop/test/validate/deploy approach. A primary component of the instrument is the software required for system control/management and data collection/reduction. This software, along with evaluation algorithms, advanced pDv from a laboratory curiosity to a production-level instrument. Simultaneous pDv and pitot probe velocity measurements obtained at the centerline of a flow exiting a two-inch jet matched within 0.4%. Flow turbulence spectra obtained with pDv and a hot-wire detected, with equal dynamic range, the primary and secondary harmonics produced by the fan driving the flow. Novel hardware and software methods were developed, tested and incorporated into the system to eliminate and/or minimize error sources and improve system reliability.
ENCOMPASS: A SAGA based environment for the composition of programs and specifications, appendix A
NASA Technical Reports Server (NTRS)
Terwilliger, Robert B.; Campbell, Roy H.
1985-01-01
ENCOMPASS is an example integrated software engineering environment being constructed by the SAGA project. ENCOMPASS supports the specification, design, construction and maintenance of efficient, validated, and verified programs in a modular programming language. The life cycle paradigm, schema of software configurations, and hierarchical library structure used by ENCOMPASS are presented. In ENCOMPASS, the software life cycle is viewed as a sequence of developments, each of which reuses components from the previous ones. Each development proceeds through the phases of planning, requirements definition, validation, design, implementation, and system integration. The components in a software system are modeled as entities which have relationships between them. An entity may have different versions, and different views of the same project are allowed. The simple entities supported by ENCOMPASS may be combined into modules which may be collected into projects. ENCOMPASS supports multiple programmers and projects using a hierarchical library system containing a workspace for each programmer, a project library for each project, and a global library common to all projects.
Generalized Support Software: Domain Analysis and Implementation
NASA Technical Reports Server (NTRS)
Stark, Mike; Seidewitz, Ed
1995-01-01
For the past five years, the Flight Dynamics Division (FDD) at NASA's Goddard Space Flight Center has been carrying out a detailed domain analysis effort and is now beginning to implement Generalized Support Software (GSS) based on this analysis. GSS is part of the larger Flight Dynamics Distributed System (FDDS), and is designed to run under the FDDS User Interface / Executive (UIX). The FDD is transitioning from a mainframe-based environment to systems running on engineering workstations. The GSS will be a library of highly reusable components that may be configured within the standard FDDS architecture to quickly produce low-cost satellite ground support systems. The estimate for the first release is that this library will contain approximately 200,000 lines of code. The main driver for developing generalized software is improvement of development cost and schedule. The goal is to ultimately have at least 80 percent of all software required for a spacecraft mission (within the domain supported by the GSS) be configured from the generalized components.
Andreadis, Konstantinos M.; Das, Narendra; Stampoulis, Dimitrios; Ines, Amor; Fisher, Joshua B.; Granger, Stephanie; Kawata, Jessie; Han, Eunjin; Behrangi, Ali
2017-01-01
The Regional Hydrologic Extremes Assessment System (RHEAS) is a prototype software framework for hydrologic modeling and data assimilation that automates the deployment of water resources nowcasting and forecasting applications. A spatially-enabled database is a key component of the software that can ingest a suite of satellite and model datasets while facilitating the interfacing with Geographic Information System (GIS) applications. The datasets ingested are obtained from numerous space-borne sensors and represent multiple components of the water cycle. The object-oriented design of the software allows for modularity and extensibility, showcased here with the coupling of the core hydrologic model with a crop growth model. RHEAS can exploit multi-threading to scale with increasing number of processors, while the database allows delivery of data products and associated uncertainty through a variety of GIS platforms. A set of three example implementations of RHEAS in the United States and Kenya are described to demonstrate the different features of the system in real-world applications. PMID:28545077
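As a rough illustration of the ingestion idea, the sketch below stores observations from different sensors in one queryable table keyed by variable, date, and location. The real system uses a spatially-enabled database (PostGIS); sqlite stands in here so the example is self-contained, and the schema, sensor names, and values are assumptions.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE observations (
    variable TEXT, sensor TEXT, date TEXT,
    lat REAL, lon REAL, value REAL)""")

rows = [
    ("soil_moisture", "SMAP", "2017-01-01", -1.29, 36.82, 0.21),
    ("precipitation", "TRMM", "2017-01-01", -1.29, 36.82, 3.4),
    ("soil_moisture", "SMAP", "2017-01-02", -1.29, 36.82, 0.24),
]
conn.executemany("INSERT INTO observations VALUES (?, ?, ?, ?, ?, ?)", rows)

# A hydrologic model component can then pull its forcing data uniformly,
# whatever sensor it came from.
for row in conn.execute("""SELECT date, value FROM observations
                           WHERE variable = 'soil_moisture'
                           ORDER BY date"""):
    print(row)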
Symbolic Constraint Maintenance Grid
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
Version 3.1 of Symbolic Constraint Maintenance Grid (SCMG) is a software system that provides a general conceptual framework for utilizing pre-existing programming techniques to perform symbolic transformations of data. SCMG also provides a language (and an associated communication method and protocol) for representing constraints on the original non-symbolic data. SCMG provides a facility for exchanging information between numeric and symbolic components without knowing the details of the components themselves. In essence, it integrates symbolic software tools (for diagnosis, prognosis, and planning) with non-artificial-intelligence software. SCMG executes a process of symbolic summarization and monitoring of continuous time series data that are being abstractly represented as symbolic templates of information exchange. This summarization process enables such symbolic-reasoning computing systems as artificial-intelligence planning systems to evaluate the significance and effects of channels of data more efficiently than would otherwise be possible. As a result of the increased efficiency in representation, reasoning software can monitor more channels and is thus able to perform monitoring and control functions more effectively.
2015-05-01
application, while the simulated PLC software is the open source ModbusPal Java application. When queried using the Modbus TCP protocol, ModbusPal reports...and programmable logic controller (PLC) components. The HMI and PLC components were instantiated with software and installed in multiple virtual...creating and capturing HMI-PLC network traffic over a 24-h period in the virtualized network and inspect the packets for errors. Test the
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the third of five volumes on Information System Life-Cycle and Documentation Standards which present a well organized, easily used standard for providing technical information needed for developing information systems, components, and related processes. This volume states the Software Management and Assurance Program documentation standard for a product specification document and for data item descriptions. The framework can be applied to any NASA information system, software, hardware, operational procedures components, and related processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laros, James H.; Grant, Ryan; Levenhagen, Michael J.
Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.
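The shape of such a portable interface might look like the following sketch. Every name in it is invented for illustration; this is not the proposed API, only the pattern of a uniform measurement and control abstraction that composes from node level down to individual sockets.

class PowerObject:
    def __init__(self, name, children=()):
        self.name, self.children = name, list(children)

    def measure_watts(self):
        # Platform-specific drivers would implement this; here a node's
        # power is assumed to be the sum of its children's readings.
        return sum(child.measure_watts() for child in self.children)

class CPUSocket(PowerObject):
    def __init__(self, name, watts):
        super().__init__(name)
        self._watts = watts
        self.power_cap = None

    def measure_watts(self):
        return min(self._watts, self.power_cap or self._watts)

    def set_power_cap(self, watts):
        # Control as well as measurement, exposed uniformly.
        self.power_cap = watts

node = PowerObject("node0", [CPUSocket("cpu0", 95.0), CPUSocket("cpu1", 88.0)])
node.children[0].set_power_cap(75.0)
print(node.measure_watts())   # 75.0 + 88.0 = 163.0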
CARDS: A blueprint and environment for domain-specific software reuse
NASA Technical Reports Server (NTRS)
Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine
1992-01-01
CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'
Preparing a scientific manuscript in Linux: Today's possibilities and limitations.
Tchantchaleishvili, Vakhtang; Schmitto, Jan D
2011-10-22
An increasing number of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow them to prepare a submission-ready scientific manuscript without the need to use proprietary software. Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes key steps for preparation of a publication-ready scientific manuscript in a Linux-based operating system, as well as discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux.
NASA Astrophysics Data System (ADS)
Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina
Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).
Practical Issues in Implementing Software Reliability Measurement
NASA Technical Reports Server (NTRS)
Nikora, Allen P.; Schneidewind, Norman F.; Everett, William W.; Munson, John C.; Vouk, Mladen A.; Musa, John D.
1999-01-01
Many ways of estimating software systems' reliability, or reliability-related quantities, have been developed over the past several years. Of particular interest are methods that can be used to estimate a software system's fault content prior to test, or to discriminate between components that are fault-prone and those that are not. The results of these methods can be used to: 1) More accurately focus scarce fault identification resources on those portions of a software system most in need of it. 2) Estimate and forecast the risk of exposure to residual faults in a software system during operation, and develop risk and safety criteria to guide the release of a software system to fielded use. 3) Estimate the efficiency of test suites in detecting residual faults. 4) Estimate the stability of the software maintenance process.
Autonomous robot software development using simple software components
NASA Astrophysics Data System (ADS)
Burke, Thomas M.; Chung, Chan-Jin
2004-10-01
Developing software to control a sophisticated lane-following, obstacle-avoiding, autonomous robot can be demanding and beyond the capabilities of novice programmers - but it doesn't have to be. A creative software design utilizing only basic image processing and a little algebra has been employed to control the LTU-AISSIG autonomous robot - a contestant in the 2004 Intelligent Ground Vehicle Competition (IGVC). This paper presents a software design equivalent to that used during the IGVC, but with much of the complexity removed. The result is an autonomous robot software design that is robust, reliable, and can be implemented by programmers with a limited understanding of image processing. This design provides a solid basis for further work in autonomous robot software, as well as an interesting and achievable robotics project for students.
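A stripped-down version of that "basic image processing and a little algebra" approach fits in a few lines. This is a simplified sketch in the spirit of the paper, not its actual code; the threshold, gain, and sign convention are assumptions.

import numpy as np

def steering_command(gray_frame, threshold=200, gain=0.8):
    # Lane markings are assumed brighter than the ground surface.
    ys, xs = np.nonzero(gray_frame > threshold)
    if xs.size == 0:
        return 0.0                       # no lane seen: hold course
    center = gray_frame.shape[1] / 2.0
    offset = (xs.mean() - center) / center   # normalized to roughly [-1, 1]
    return gain * offset                 # positive = steer right, toward the stripe

frame = np.zeros((240, 320), dtype=np.uint8)
frame[:, 250:260] = 255                  # bright lane stripe right of center
print(f"steer = {steering_command(frame):+.2f}")   # positive: turn right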
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-09
... With Image Processing Systems, Components Thereof, and Associated Software; Notice of Commission... importation of certain electronic devices with image processing systems, components thereof, and associated... direct infringement is asserted and the accused article does not meet every limitation of the asserted...
The Benefits of Multimedia Computer Software for Students with Disabilities.
ERIC Educational Resources Information Center
Green, Douglas W.
This paper assesses the current state of research and informed opinion on the benefits of multimedia computer software for students with disabilities. Topics include: a definition of multimedia; advantages of multimedia; Multiple Intelligences Theory, which holds that intellectual ability consists of seven components; motivation and behavior…
Coupled dam safety analysis using WinDAM
USDA-ARS?s Scientific Manuscript database
Windows® Dam Analysis Modules (WinDAM) is a set of modular software components that can be used to analyze overtopping and internal erosion of embankment dams. Dakota is an extensive software framework for design exploration and simulation. These tools can be coupled to create a powerful framework...
From Workstation to Teacher Support System: A Tool to Increase Productivity.
ERIC Educational Resources Information Center
Chen, J. Wey
1989-01-01
Describes a teacher support system which is a computer-based workstation that provides support for teachers and administrators by integrating teacher utility programs, instructional management software, administrative packages, and office automation tools. Hardware is described and software components are explained, including database managers,…
Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission
NASA Technical Reports Server (NTRS)
Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan
2010-01-01
The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by their respective domain experts.
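The wrapping-and-chaining pattern reduces to a uniform call signature plus function composition. The sketch below imitates it with plain Python functions standing in for the Web Service endpoints; the stage names and toy physics are invented.

from typing import Callable, Dict

Service = Callable[[Dict], Dict]   # uniform interface: params in, params out

def radiative_transfer(params):
    params["radiance"] = params["albedo"] * params["solar_irradiance"]
    return params

def instrument_model(params):
    params["signal"] = params["radiance"] * params.get("throughput", 0.9)
    return params

def retrieval(params):
    # Invert the (toy) sensing chain to recover albedo.
    params["albedo_retrieved"] = params["signal"] / (
        params["solar_irradiance"] * params.get("throughput", 0.9))
    return params

def run_workflow(stages, params):
    # In the real system each stage is a Web Service endpoint; here the
    # chain is ordinary function composition with a shared parameter dict.
    for stage in stages:
        params = stage(params)
    return params

result = run_workflow([radiative_transfer, instrument_model, retrieval],
                      {"albedo": 0.3, "solar_irradiance": 1361.0})
print(result["albedo_retrieved"])   # recovers 0.3 in this lossless toy chain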
An integrated software suite for surface-based analyses of cerebral cortex.
Van Essen, D C; Drury, H A; Dickson, J; Harwell, J; Hanlon, D; Anderson, C H
2001-01-01
The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.
Combining virtual reality and multimedia techniques for effective maintenance training
NASA Astrophysics Data System (ADS)
McLin, David M.; Chung, James C.
1996-02-01
This paper describes a virtual reality (VR) system developed for use as part of an integrated, low-cost, stand-alone, multimedia trainer. The trainer is used to train National Guard personnel in maintenance and trouble-shooting tasks for the M1A1 Abrams tank, the M2A2 Bradley fighting vehicle and the TOW II missile system. The VR system features a modular, extensible, object-oriented design which consists of a training monitor component, a VR run time component, a model loader component, and a set of domain-specific object behaviors which mimic the behavior of objects encountered in the actual vehicles. The VR system is built from a combination of off-the-shelf commercial software and custom software developed at RTI.
NASA Astrophysics Data System (ADS)
Silva, N.; Esper, A.
2012-01-01
The work presented in this article represents the results of applying RAMS analysis to a critical space control system, both at system and software levels. The system-level RAMS analysis allowed the assignment of criticalities to the high-level components, which was further refined by a tailored software-level RAMS analysis. The importance of the software-level RAMS analysis in the identification of new failure modes and its impact on the system-level RAMS analysis is discussed. Recommendations of changes in the software architecture have also been proposed in order to reduce the criticality of the SW components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring is also detailed in the article, and lessons learned from its application are shared, underlining its importance for space system safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required for performing the tasks and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.
Oxygen Generation System Laptop Bus Controller Flight Software
NASA Technical Reports Server (NTRS)
Rowe, Chad; Panter, Donna
2009-01-01
The Oxygen Generation System Laptop Bus Controller Flight Software was developed to allow the International Space Station (ISS) program to activate specific components of the Oxygen Generation System (OGS) to perform a checkout of key hardware operation in a microgravity environment, as well as to perform preventative maintenance operations of system valves during a long period of what would otherwise be hardware dormancy. The software provides direct connectivity to the OGS Firmware Controller with pre-programmed tasks operated by on-orbit astronauts to exercise OGS valves and motors. The software is used to manipulate the pump, separator, and valves to alleviate the concerns of hardware problems due to long-term inactivity and to allow for operational verification of microgravity-sensitive components early enough so that, if problems are found, they can be addressed before the hardware is required for operation on-orbit. The decision was made to use existing on-orbit IBM ThinkPad A31p laptops and MIL-STD-1553B interface cards as the hardware configuration. The software at the time of this reporting was developed and tested for use under the Windows 2000 Professional operating system to ensure compatibility with the existing on-orbit computer systems.
2012-03-13
Legacy Maintenance and Brownfield Development 6.6.6 Agile and Kanban Development 6.6.7 Putting It All Together at the Large-Project or Enterprise Level...NDI)-intensive systems Ultrahigh software system assurance; Legacy maintenance and brownfield development; and Agile and kanban development. This...be furnished by NDI components or may need to be developed for special systems. Legacy Maintenance and Brownfield Development Fewer and fewer software
Debugging and Logging Services for Defence Service Oriented Architectures
2012-02-01
Service: a software component and callable end point that provides a logically related set of operations, each of which performs a logical step in a...important to note that in some cases when the fault is identified to lie in uneditable code such as program libraries, or outsourced software services...debugging is limited to characterisation of the fault, reporting it to the software or service provider and development of work-arounds and management
Compositional Specification of Software Architecture
NASA Technical Reports Server (NTRS)
Penix, John; Lau, Sonie (Technical Monitor)
1998-01-01
This paper describes our experience using parameterized algebraic specifications to model properties of software architectures. The goal is to model the decomposition of requirements independently of the style used to implement the architecture. We begin by providing an overview of the role of architecture specification in software development. We then describe how architecture specifications are built up from component and connector specifications, and give an overview of insights gained from a case study used to validate the method.
Peridigm summary report : lessons learned in development with agile components.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salinger, Andrew Gerhard; Mitchell, John Anthony; Littlewood, David John
2011-09-01
This report details efforts to deploy Agile Components for rapid development of a peridynamics code, Peridigm. The goal of Agile Components is to enable the efficient development of production-quality software by providing a well-defined, unifying interface to a powerful set of component-based software. Specifically, Agile Components facilitate interoperability among packages within the Trilinos Project, including data management, time integration, uncertainty quantification, and optimization. Development of the Peridigm code served as a testbed for Agile Components and resulted in a number of recommendations for future development. Agile Components successfully enabled rapid integration of Trilinos packages into Peridigm. A cost of this approach, however, was a set of restrictions on Peridigm's architecture which impacted the ability to track history-dependent material data, dynamically modify the model discretization, and interject user-defined routines into the time integration algorithm. These restrictions resulted in modifications to the Agile Components approach, as implemented in Peridigm, and in a set of recommendations for future Agile Components development. Specific recommendations include improved handling of material states, a more flexible flow control model, and improved documentation. A demonstration mini-application, SimpleODE, was developed at the onset of this project and is offered as a potential supplement to Agile Components documentation.
Space-Based Reconfigurable Software Defined Radio Test Bed Aboard International Space Station
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Lux, James P.
2014-01-01
The National Aeronautics and Space Administration (NASA) recently launched a new software defined radio research test bed to the International Space Station. The test bed, sponsored by the Space Communications and Navigation (SCaN) Office within NASA, is referred to as the SCaN Testbed. The SCaN Testbed is a highly capable communications system, composed of three software defined radios, integrated into a flight system, and mounted to the truss of the International Space Station. Software defined radios offer the future promise of in-flight reconfigurability, autonomy, and eventually cognitive operation. The adoption of software defined radios offers space missions a new way to develop and operate space transceivers for communications and navigation. Reconfigurable or software defined radios with communications and navigation functions implemented in software or VHDL (VHSIC Hardware Description Language) provide the capability to change the functionality of the radio during development or after launch. The ability to change the operating characteristics of a radio through software once deployed to space offers the flexibility to adapt to new science opportunities, recover from anomalies within the science payload or communication system, and potentially reduce development cost and risk by adapting generic space platforms to meet specific mission requirements. The software defined radios on the SCaN Testbed are each compliant with NASA's Space Telecommunications Radio System (STRS) Architecture. The STRS Architecture is an open, non-proprietary architecture that defines interfaces for the connections between radio components. It provides an operating environment to abstract the communication waveform application from the underlying platform-specific hardware, such as digital-to-analog converters, analog-to-digital converters, oscillators, RF attenuators, automatic gain control circuits, FPGAs, and general-purpose processors, and from the interconnections among different radio components.
Open source electronic health record and patient data management system for intensive care.
Massaut, Jacques; Reper, Pascal
2008-01-01
In Intensive Care Units, the amount of data to be processed for patient care, the turnover of the patients, and the necessity for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit and to avoid being locked in to proprietary software, we developed a PDMS and EHR based on open source software and components. The software was designed as a client-server architecture running on the Linux operating system and powered by the PostgreSQL database system. The client software was developed in C using the GTK interface library. The application offers users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and the possibility to encode medical activities for billing processes. Since its deployment in February 2004, the PDMS has been used to care for more than three thousand patients with the expected software reliability, and it facilitated data management and review processes. Communication with other medical software was not developed from the start and is realized through the Mirth HL7 communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on open source software components, was able to respond to the medical needs of the local ICU environment. The use of OSS for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.
Improving Software Sustainability: Lessons Learned from Profiles in Science.
Gallagher, Marie E
2013-01-01
The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment. Sometimes these changes happen on short notice, so we continually monitor our library's software for signs of endangerment. We have attempted to replace proprietary software with suitable in-house or open source software. When the replacement involves a standalone piece of software with a nearly equivalent version, such as replacing a commercial HTTP server with an open source HTTP server, the replacement is straightforward. Recently we replaced software that functioned not only as our search engine but also as the backbone of the architecture of our Web site. In this paper, we describe the lessons learned and the pros and cons of replacing this software with open source software.
NASA Astrophysics Data System (ADS)
Alexander, K.; Easterbrook, S. M.
2015-01-01
We analyse the source code of eight coupled climate models, selected from those that participated in the CMIP5 (Taylor et al., 2012) or EMICAR5 (Eby et al., 2013; Zickfeld et al., 2013) intercomparison projects. For each model, we sort the preprocessed code into components and subcomponents based on dependency structure. We then create software architecture diagrams which show the relative sizes of these components/subcomponents and the flow of data between them. The diagrams also illustrate several major classes of climate model design; the distribution of complexity between components, which depends on historical development paths as well as the conscious goals of each institution; and the sharing of components between different modelling groups. These diagrams offer insights into the similarities and differences between models, and have the potential to be useful tools for communication between scientists, scientific institutions, and the public.
NASA Astrophysics Data System (ADS)
Alexander, K.; Easterbrook, S. M.
2015-04-01
We analyze the source code of eight coupled climate models, selected from those that participated in the CMIP5 (Taylor et al., 2012) or EMICAR5 (Eby et al., 2013; Zickfeld et al., 2013) intercomparison projects. For each model, we sort the preprocessed code into components and subcomponents based on dependency structure. We then create software architecture diagrams that show the relative sizes of these components/subcomponents and the flow of data between them. The diagrams also illustrate several major classes of climate model design; the distribution of complexity between components, which depends on historical development paths as well as the conscious goals of each institution; and the sharing of components between different modeling groups. These diagrams offer insights into the similarities and differences in structure between climate models, and have the potential to be useful tools for communication between scientists, scientific institutions, and the public.
Modelling and Implementation of Catalogue Cards Using FreeMarker
ERIC Educational Resources Information Center
Radjenovic, Jelen; Milosavljevic, Branko; Surla, Dusan
2009-01-01
Purpose: The purpose of this paper is to report on a study involving the specification (using Unified Modelling Language (UML) 2.0) of information requirements and implementation of the software components for generating catalogue cards. The implementation in a Java environment is developed using the FreeMarker software.…
Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polly, B.
2011-09-01
This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.
NASA Tech Briefs, December 1998. Volume 22, No. 12
NASA Technical Reports Server (NTRS)
1998-01-01
Topics include: special coverage section on design and analysis software, and sections on electronic components and circuits, electronic systems, software, materials, mechanics, machinery/automation, manufacturing/fabrication, physical sciences, and special sections of Photonics Tech Briefs, Motion Control Tech briefs and a Hot Technology File 1999 Resource Guide.
USDA-ARS?s Scientific Manuscript database
This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...
Designing an Electronic Classroom for Large College Courses.
ERIC Educational Resources Information Center
Aiken, Milam W.; Hawley, Delvin D.
1995-01-01
Describes a state-of-the-art electronic classroom at the University of Mississippi School of Business designed for large numbers of students and regularly scheduled classes. Highlights include: architecture of the room, hardware components, software utilized in the room, and group decision support system software and its uses. (JKP)
Mell, Matthew; Tefera, Girma; Thornton, Frank; Siepman, David; Turnipseed, William
2007-03-01
The diagnostic accuracy of magnetic resonance angiography (MRA) in the infrapopliteal arterial segment is not well defined. This study evaluated the clinical utility and diagnostic accuracy of time-resolved imaging of contrast kinetics (TRICKS) MRA compared with digital subtraction contrast angiography (DSA) in planning for percutaneous interventions of popliteal and infrapopliteal arterial occlusive disease. Patients who underwent percutaneous lower extremity interventions for popliteal or tibial occlusive disease were identified for this study. Preprocedural TRICKS MRA was performed with 1.5 Tesla (GE Healthcare, Waukesha, Wis) magnetic resonance imaging scanners with a flexible peripheral vascular coil, using the TRICKS technique with gadodiamide injection. DSA was performed using standard techniques in angiography suite with a 15-inch image intensifier. DSA was considered the gold standard. The MRA and DSA were then evaluated in a blinded fashion by a radiologist and a vascular surgeon. The popliteal artery and tibioperoneal trunk were evaluated separately, and the tibial arteries were divided into proximal, mid, and distal segments. Each segment was interpreted as normal (0% to 49% stenosis), stenotic (50% to 99% stenosis), or occluded (100%). Lesion morphology was classified according to the TransAtlantic Inter-Society Consensus (TASC). We calculated concordance between the imaging studies and the sensitivity and specificity of MRA. The clinical utility of MRA was also assessed in terms of identifying arterial access site as well as predicting technical success of the percutaneous treatment. Comparisons were done on 150 arterial segments in 30 limbs of 27 patients. When evaluated by TASC classification, TRICKS MRA correlated with DSA in 83% of the popliteal and in 88% of the infrapopliteal segments. MRA correctly identified significant disease of the popliteal artery with a sensitivity of 94% and a specificity of 92%, and of the tibial arteries with a sensitivity of 100% and specificity of 84%. When used to evaluate for stenosis vs occlusion, MRA interpretation agreed with DSA 90% of the time. Disagreement occurred in 15 arterial segments, most commonly in distal tibioperoneal arteries. MRA misdiagnosed occlusion for stenosis in 11 of 15 segments, and stenosis for occlusion in four of 15 segments. Arterial access was accurately planned based on preprocedural MRA findings in 29 of 30 patients. MRA predicted technical success 83% of the time. Five technical failures were due to inability to cross arterial occlusions, all accurately identified by MRA. TRICKS MRA is an accurate method of evaluating patients for popliteal and infrapopliteal arterial occlusive disease and can be used for planning percutaneous interventions.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced, efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures, including an expert system, a probabilistic finite element code, a probabilistic boundary element code, and a fast probability integrator. The expert system is included to capture and utilize PSAM knowledge and experience: NESSUS/EXPERT is an interactive, menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The NESSUS system also contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities; a broad range of analysis capabilities and an extensive element library are provided.
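The flavor of the probabilistic computation can be conveyed with plain Monte Carlo sampling. NESSUS's fast probability integrator exists precisely because brute-force sampling like this is too slow for small failure probabilities, so treat the sketch as an illustration of the problem statement; the stress and strength distributions are assumed.

import random

def failure_probability(n=100_000, seed=1):
    # Treat load and strength as random variables and estimate
    # P(stress > strength) by sampling.
    random.seed(seed)
    failures = 0
    for _ in range(n):
        stress = random.gauss(300.0, 30.0)     # MPa, assumed distribution
        strength = random.gauss(400.0, 25.0)   # MPa, assumed distribution
        if stress > strength:
            failures += 1
    return failures / n

print(f"P(failure) ~ {failure_probability():.4f}")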
JAMS - a software platform for modular hydrological modelling
NASA Astrophysics Data System (ADS)
Kralisch, Sven; Fischer, Christian
2015-04-01
Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand ever-increasing integration of data and process knowledge in the corresponding simulation models. Software frameworks that allow for a seamless creation of integrated models based on less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an open-source software framework that has been especially designed to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach for representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain-specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models have been implemented and successfully applied during the last years. We will present the JAMS core concepts and give an overview of models, simulation components and support tools available for that framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.
NASA Technical Reports Server (NTRS)
Condon, Steven; Hendrick, Robert; Stark, Michael E.; Steger, Warren
1997-01-01
The Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center (GSFC) recently embarked on a far-reaching revision of its process for developing and maintaining satellite support software. The new process relies on an object-oriented software development method supported by a domain-specific library of generalized components. This Generalized Support Software (GSS) Domain Engineering Process is currently in use at the NASA GSFC Software Engineering Laboratory (SEL). The key facets of the GSS process are (1) an architecture for rapid deployment of FDD applications, (2) a reuse asset library for FDD classes, and (3) a paradigm shift from developing software to configuring software for mission support. This paper describes the GSS architecture and process, results of fielding the first applications, lessons learned, and future directions.
Software reliability experiments data analysis and investigation
NASA Technical Reports Server (NTRS)
Walker, J. Leslie; Caglayan, Alper K.
1991-01-01
The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.
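The comparison between the two structures is easy to state under the idealized independence assumption that the paper's data calls into question. The sketch below computes the reliability of 2-out-of-3 majority voting versus a three-alternate recovery block with an imperfect but independent acceptance test; all probabilities are illustrative.

def nversion_reliability(p):
    # 2-out-of-3 majority voting: succeeds if at most one version fails.
    q = 1 - p
    return q**3 + 3 * q**2 * p

def recovery_block_reliability(p, p_acc, n=3):
    # Each alternate succeeds if it is correct AND the (independent)
    # acceptance test accepts it; alternates are tried until one succeeds.
    s = (1 - p) * (1 - p_acc)
    return 1 - (1 - s) ** n

for p in (0.05, 0.10):
    print(f"p={p:.2f}  N-version: {nversion_reliability(p):.4f}  "
          f"recovery block: {recovery_block_reliability(p, p_acc=0.01):.4f}")

Under these assumptions the recovery block comes out ahead, consistent with the authors' finding that recovery blocks offer more gain when acceptance checks fail independently of the software components.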
A Practical Application of Microcomputers to Control an Active Solar System.
ERIC Educational Resources Information Center
Goldman, David S.; Warren, William
1984-01-01
Describes the design and implementation of a microcomputer-based model active solar heating system. Includes discussions of: (1) the active solar components (solar collector, heat exchanger, pump, and fan necessary to provide forced air heating); (2) software components; and (3) hardware components (in the form of sensors and actuators). (JN)
NEXUS - Resilient Intelligent Middleware
NASA Astrophysics Data System (ADS)
Kaveh, N.; Hercock, R. Ghanea
Service-oriented computing, a composition of distributed-object, component-based, and Web-based computing concepts, is becoming the widespread choice for developing dynamic heterogeneous software assets available as services across a network. One of the major strengths of service-oriented technologies is the high abstraction layer and large granularity level at which software assets are viewed compared to traditional object-oriented technologies. Collaboration through encapsulated and separately defined service interfaces creates a service-oriented environment, whereby multiple services can be linked together through their interfaces to compose a functional system. This approach enables better integration of legacy and non-legacy services, via wrapper interfaces, and allows for service composition at a more abstract level, especially in cases such as vertical market stacks. The heterogeneous nature of service-oriented technologies and the granularity of their software components make them a suitable computing model in the pervasive domain.
Software Considerations for Subscale Flight Testing of Experimental Control Laws
NASA Technical Reports Server (NTRS)
Murch, Austin M.; Cox, David E.; Cunningham, Kevin
2009-01-01
The NASA AirSTAR system has been designed to address the challenges associated with safe and efficient subscale flight testing of research control laws in adverse flight conditions. In this paper, software elements of this system are described, with an emphasis on components which allow for rapid prototyping and deployment of aircraft control laws. Through model-based design and automatic code generation, a common code-base is used for desktop analysis, piloted simulation, and real-time flight control. The flight control system provides the ability to rapidly integrate and test multiple research control laws and to emulate component or sensor failures. Integrated integrity monitoring systems provide aircraft structural load protection, isolate the system from control algorithm failures, and monitor the health of telemetry streams. Finally, issues associated with software configuration management and code modularity are briefly discussed.
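The failure-emulation and integrity-monitoring ideas can be pictured with a short, purely hypothetical Python sketch (this is not AirSTAR code; the fault modes, limits, and names are invented for illustration): a shim between the sensor stream and the research control law injects a selectable fault mode, and a monitor reverts to the baseline system when telemetry goes bad.

    import math

    FAULT_MODES = {
        "none":    lambda t, v: v,
        "stuck":   lambda t, v: 0.0,                                  # stuck-at-zero
        "bias":    lambda t, v: v + 0.5,                              # constant offset
        "dropout": lambda t, v: float("nan") if int(t) % 5 == 0 else v,
    }

    def emulate_sensor(t, true_value, mode="none"):
        """Inject the selected fault mode into the sensor stream."""
        return FAULT_MODES[mode](t, true_value)

    def monitor(value, limit=10.0):
        """Integrity check: reject NaN or out-of-range telemetry."""
        return not math.isnan(value) and abs(value) <= limit

    for t in range(8):
        v = emulate_sensor(t, math.sin(t), mode="dropout")
        print(t, v if monitor(v) else "REVERT TO BASELINE")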
Large-scale structural optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.
1983-01-01
Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large-scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single design methodology rather than trade-offs, and the incompatibility of large-scale optimization with single-program, single-computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop; the full analysis is then performed only periodically. Problem-dependent software, which embodies the definitions of the design variables, objective function, and design constraints, can be separated from the generic code using a systems programming technique. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.
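The idea of keeping the full analysis outside the optimization loop can be sketched as follows (a minimal illustration, not the paper's algorithm; the quadratic full_analysis stand-in and the move limits are assumptions): the expensive analysis runs once per design cycle, while the inner optimizer works against a cheap local approximation of it.

    import numpy as np
    from scipy.optimize import minimize

    def full_analysis(x):
        """Stand-in for an expensive whole-aircraft analysis: value and gradient."""
        return np.sum((x - 1.0) ** 2), 2.0 * (x - 1.0)

    x = np.zeros(4)
    for cycle in range(5):                        # outer design cycles
        f0, g0 = full_analysis(x)                 # full analysis: once per cycle
        approx = lambda s: f0 + g0 @ s            # cheap linear approximation
        res = minimize(approx, np.zeros_like(x),
                       bounds=[(-0.5, 0.5)] * len(x))   # move limits
        x = x + res.x
        print(f"cycle {cycle}: f = {full_analysis(x)[0]:.4f}")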
Preparing a scientific manuscript in Linux: Today's possibilities and limitations
2011-01-01
Background Increasing numbers of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would make it possible to prepare a submission-ready scientific manuscript without the need to use proprietary software. Findings Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes the key steps for preparing a publication-ready scientific manuscript in a Linux-based operating system and discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux. PMID:22018246
Exploiting IoT Technologies and Open Source Components for Smart Seismic Network Instrumentation
NASA Astrophysics Data System (ADS)
Germenis, N. G.; Koulamas, C. A.; Foundas, P. N.
2017-12-01
The data collection infrastructure of any seismic network poses a number of requirements and trade-offs related to accuracy, reliability, power autonomy, and installation and operational costs. Given the right hardware design at the edge of this infrastructure, the embedded software running inside the instruments is the heart of the pre-processing and communication services and of their integration with the central storage and processing facilities of the seismic network. This work demonstrates the feasibility and benefits of exploiting software components from heterogeneous sources in order to realize a smart seismic data logger, achieving higher reliability, faster integration, and lower development and testing costs for critical functionality that is in turn responsible for the cost- and power-efficient operation of the device. The instrument's software builds on top of widely used open source components around the Linux kernel with real-time extensions, the core Debian Linux distribution, the earthworm and seiscomp tooling frameworks, as well as components from the Internet of Things (IoT) world, such as the CoAP and MQTT protocols for the signaling plane, besides the widely used de facto standards of the application domain at the data plane, such as the SeedLink protocol. By using an innovative integration of features based on lower-level GPL components of the seiscomp suite with higher-level processing earthworm components, coupled with IoT protocol extensions to the latter, the instrument can implement smart functionality such as network-controlled, event-triggered data transmission in parallel with edge archiving and on-demand, short-term historical data retrieval.
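As a rough illustration of network-controlled, event-triggered transmission over an IoT signaling protocol, the following Python sketch uses the paho-mqtt client library (1.x-style callback API); the broker address, topic names, and trigger threshold are hypothetical, not taken from the instrument.

    import json
    import paho.mqtt.client as mqtt

    THRESHOLD = 0.01          # trigger level, retunable from the network
    armed = True

    def on_message(client, userdata, msg):
        """Signaling plane: the network can re-arm or retune the trigger."""
        global THRESHOLD, armed
        cmd = json.loads(msg.payload)
        THRESHOLD = cmd.get("threshold", THRESHOLD)
        armed = cmd.get("armed", armed)

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("broker.example.org", 1883)
    client.subscribe("seismic/station42/control")
    client.loop_start()

    def on_sample(amplitude, window):
        """Data plane: publish a window only when the trigger fires."""
        if armed and abs(amplitude) >= THRESHOLD:
            client.publish("seismic/station42/events",
                           json.dumps({"peak": amplitude, "samples": window}))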
NASA Technical Reports Server (NTRS)
Fournelle, John; Carpenter, Paul
2006-01-01
Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.
The RISC (Reduced Instruction Set Computer) Architecture and Computer Performance Evaluation.
1986-03-01
time where the main emphasis of the evaluation process is put on the software. The model is intended to provide a tool for computer architects to use...program, or 3) Was to be implemented in random logic more effectively than the equivalent sequence of software instructions. Both data and address...definition is the IEEE standard 729-1983 stating Computer Architecture as: "The process of defining a collection of hardware and software components and
2012-03-13
Brownfield Development 6.6.6 Agile and Kanban Development 6.6.7 Putting It All Together at the Large-Project or Enterprise Level 6.7 References 7...Ultrahigh software system assurance; Legacy maintenance and brownfield development; and Agile and kanban development. This chapter summarizes each...components or may need to be developed for special systems. Legacy Maintenance and Brownfield Development Fewer and fewer software-intensive systems have
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S; Sadlier, Ronald J
We show how to extend the paradigm of software-defined communication to include quantum communication systems. We introduce the decomposition of a quantum communication terminal into layers separating the concerns of the hardware, software, and middleware. We provide detailed descriptions of how each component operates and we include results of an implementation of the super-dense coding protocol. We argue that the versatility of software-defined quantum communication test beds can be useful for exploring new regimes in communication and rapidly prototyping new systems.
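For readers unfamiliar with the protocol, here is a minimal numpy simulation of super-dense coding (an illustrative sketch, not the test-bed implementation) showing how two classical bits are carried by the single qubit of a shared Bell pair that Alice sends to Bob.

    import numpy as np

    # Single-qubit gates
    I = np.eye(2)
    X = np.array([[0, 1], [1, 0]])
    Z = np.array([[1, 0], [0, -1]])
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    # CNOT with qubit 0 as control, qubit 1 as target (basis |q0 q1>)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    def superdense(bits):
        """Encode two classical bits using one shared Bell pair."""
        state = np.array([1, 0, 0, 1]) / np.sqrt(2)   # |Phi+> = (|00>+|11>)/sqrt(2)
        b1, b0 = bits
        enc = I
        if b0:
            enc = X @ enc          # X encodes the low bit
        if b1:
            enc = Z @ enc          # Z encodes the high bit (applied after X)
        state = np.kron(enc, I) @ state       # Alice acts on her qubit only
        # Bob decodes: CNOT then Hadamard on qubit 0, then measure
        state = np.kron(H, I) @ (CNOT @ state)
        outcome = np.argmax(np.abs(state) ** 2)       # deterministic here
        return (outcome >> 1) & 1, outcome & 1

    for msg in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        assert superdense(msg) == msg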
Air Force and the Cyberspace Mission: Defending the Air Force’s Computer Network in the Future
2007-12-01
computers, their operating systems and software purchased by the Air Force are commercial off-the-shelf (COTS) components, often manufactured abroad due...crystal clear 2003 information security report: "The U.S. Department of Defense (DOD) relies too much on commercial software, doesn't know who is...creating the software, and faces other significant cybersecurity problems."11 This paper explores the topic of defense of the cyberspace domain by
InterFace: A software package for face image warping, averaging, and principal components analysis.
Kramer, Robin S S; Jenkins, Rob; Burton, A Mike
2017-12-01
We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
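The PCA component can be illustrated with a short numpy sketch (not InterFace itself, which is a MATLAB app; the random stand-in data is a placeholder for aligned face images): each face is a flattened pixel vector, the principal components are the familiar "eigenfaces", and the scores are coordinates in the face space.

    import numpy as np

    def face_pca(images, n_components=10):
        """images: array of shape (n_faces, height*width), one flattened face per row."""
        mean_face = images.mean(axis=0)
        centered = images - mean_face
        # SVD of the centered data matrix yields the principal components
        U, S, Vt = np.linalg.svd(centered, full_matrices=False)
        components = Vt[:n_components]        # "eigenfaces"
        scores = centered @ components.T      # coordinates in face space
        return mean_face, components, scores

    # Example with random stand-in data (real use needs aligned face images)
    faces = np.random.rand(50, 64 * 64)
    mean_face, comps, scores = face_pca(faces, n_components=5)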
Applications of AN OO Methodology and Case to a Daq System
NASA Astrophysics Data System (ADS)
Bee, C. P.; Eshghi, S.; Jones, R.; Kolos, S.; Magherini, C.; Maidantchik, C.; Mapelli, L.; Mornacchi, G.; Niculescu, M.; Patel, A.; Prigent, D.; Spiwoks, R.; Soloviev, I.; Caprini, M.; Duval, P. Y.; Etienne, F.; Ferrato, D.; Le van Suu, A.; Qian, Z.; Gaponenko, I.; Merzliakov, Y.; Ambrosini, G.; Ferrari, R.; Fumagalli, G.; Polesello, G.
The RD13 project has evaluated the use of the Object Oriented Information Engineering (OOIE) method during the development of several software components connected to the DAQ system. The method is supported by a sophisticated commercial CASE tool (Object Management Workbench) and programming environment (Kappa) which covers the full life-cycle of the software including model simulation, code generation and application deployment. This paper gives an overview of the method, CASE tool, DAQ components which have been developed and we relate our experiences with the method and tool, its integration into our development environment and the spiral lifecycle it supports.
Thermomechanical Multiaxial Fatigue Testing Capability Developed
NASA Technical Reports Server (NTRS)
1996-01-01
Structural components in aeronautical gas turbine engines typically experience multiaxial states of stress under nonisothermal conditions. To estimate the durability of the various components in the engine, one must characterize the cyclic deformation and fatigue behavior of the materials used under thermal and complex mechanical loading conditions. To this end, a testing protocol and associated test control software were developed at the NASA Lewis Research Center for thermomechanical axial-torsional fatigue tests. These tests are to be performed on thin-walled, tubular specimens fabricated from the cobalt-based superalloy Haynes 188. The software is written in C and runs on an MS-DOS based microcomputer.
JTAG-based remote configuration of FPGAs over optical fibers
Deng, B.; Xu, H.; Liu, C.; ...
2015-01-28
In this study, a remote FPGA-configuration method based on JTAG extension over optical fibers is presented. The method takes advantage of commercial components and ready-to-use software such as iMPACT and does not require any hardware or software development. The method combines the advantages of the slow remote JTAG configuration and the fast local flash memory configuration. The method has been verified successfully and used in the Demonstrator of Liquid-Argon Trigger Digitization Board (LTDB) for the ATLAS liquid argon calorimeter Phase-I trigger upgrade. All components on the FPGA side are verified to meet the radiation tolerance requirements.
A cyber infrastructure for the SKA Telescope Manager
NASA Astrophysics Data System (ADS)
Barbosa, Domingos; Barraca, João. P.; Carvalho, Bruno; Maia, Dalmiro; Gupta, Yashwant; Natarajan, Swaminathan; Le Roux, Gerhard; Swart, Paul
2016-07-01
The Square Kilometre Array Telescope Manager (SKA TM) will be responsible for assisting SKA Operations and Observation Management, carrying out system diagnosis and collecting monitoring and control data from the SKA subsystems and components. To provide adequate compute resources, scalability, operation continuity and high availability, as well as strict Quality of Service, the TM cyber-infrastructure (embodied in the Local Infrastructure - LINFRA) consists of COTS hardware and infrastructural software (for example: server monitoring software, host operating system, virtualization software, device firmware), providing a specially tailored Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) solution. The TM infrastructure provides services in the form of computational power, software-defined networking, power, storage abstractions, and high-level, state-of-the-art IaaS and PaaS management interfaces. This cyber platform will be tailored to each of the two SKA Phase 1 telescope instances (SKA_MID in South Africa and SKA_LOW in Australia), each presenting different computational and storage infrastructures conditioned by location. This cyber platform will provide a compute model enabling TM to manage the deployment and execution of its multiple components (observation scheduler, proposal submission tools, M&C components, forensic tools, several databases, etc.). In this sense, the TM LINFRA is primarily focused on the provision of isolated instances, mostly resorting to virtualization technologies, while defaulting to bare hardware if specifically required due to performance, security, availability, or other requirements.
14 CFR 1274.915 - Restrictions on sale or transfer of technology to foreign firms or institutions.
Code of Federal Regulations, 2013 CFR
2013-01-01
... licensing of the technology. Transfers include: (1) Sales of products or components, (2) Licenses of software or documentation related to sales of products or components, or (3) Transfers to foreign...
14 CFR § 1274.915 - Restrictions on sale or transfer of technology to foreign firms or institutions.
Code of Federal Regulations, 2014 CFR
2014-01-01
... licensing of the technology. Transfers include: (1) Sales of products or components, (2) Licenses of software or documentation related to sales of products or components, or (3) Transfers to foreign...
14 CFR 1274.915 - Restrictions on sale or transfer of technology to foreign firms or institutions.
Code of Federal Regulations, 2012 CFR
2012-01-01
... licensing of the technology. Transfers include: (1) Sales of products or components, (2) Licenses of software or documentation related to sales of products or components, or (3) Transfers to foreign...
NASA Technical Reports Server (NTRS)
Wang, Nanbor; Parameswaran, Kirthika; Kircher, Michael; Schmidt, Douglas
2003-01-01
Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.
NASA Technical Reports Server (NTRS)
Wang, Nanbor; Kircher, Michael; Schmidt, Douglas C.
2000-01-01
Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively: (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.
Efficient abstract data type components for distributed and parallel systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bastani, F.; Hilal, W.; Iyengar, S.S.
1987-10-01
One way of improving a software system's comprehensibility and maintainability is to decompose it into several components, each of which encapsulates some information concerning the system. These components can be classified into four categories: abstract data type, functional, interface, and control components. Such a classification underscores the need for different specification, implementation, and performance-improvement methods for different types of components. This article focuses on the development of high-performance abstract data type components for distributed and parallel environments.
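A minimal example of the abstract-data-type category (illustrative, not from the article): the representation is hidden behind exported operations, which is precisely what lets an implementation be swapped for a distributed or parallel one without touching client code.

    class BoundedQueue:
        """ADT component: FIFO queue with a fixed capacity."""

        def __init__(self, capacity):
            self._items = []          # hidden representation
            self._capacity = capacity

        def enqueue(self, item):
            if len(self._items) >= self._capacity:
                raise OverflowError("queue full")
            self._items.append(item)

        def dequeue(self):
            if not self._items:
                raise IndexError("queue empty")
            return self._items.pop(0)

    q = BoundedQueue(2)
    q.enqueue("a"); q.enqueue("b")
    print(q.dequeue())   # "a"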
An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex
Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.
2001-01-01
The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765
Cementitious Barriers Partnership FY2013 End-Year Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G. P.; Langton, C. A.; Burns, H. H.
2013-11-01
In FY2013, the Cementitious Barriers Partnership (CBP) demonstrated continued tangible progress toward fulfilling the objective of developing a set of software tools to improve understanding and prediction of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. In November 2012, the CBP released “Version 1.0” of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. In addition, the CBP completed development of new software for the “Version 2.0” Toolbox to be released in early FY2014 and demonstrated use of the Version 1.0 Toolbox on DOE applications. The current primary software components in both Versions 1.0 and 2.0 are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. The CBP Software Toolbox Version 1.0 supports analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. Version 2.0 includes the additional analysis of chloride attack and dual regime flow and contaminant migration in fractured and non-fractured cementitious material. The LeachXS component embodies an extensive material property measurements database along with chemical speciation and reactive mass transport simulation cases with emphasis on leaching of major, trace and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride, sulfate, etc.). THAMES is a planned future CBP Toolbox component focused on simulation of the microstructure of cementitious materials and calculation of resultant hydraulic and constituent mass transfer parameters needed in modeling. Two CBP software demonstrations were conducted in FY2013, one to support the Saltstone Disposal Facility (SDF) at SRS and the other on a representative Hanford high-level waste tank. The CBP Toolbox demonstration on the SDF provided analysis of the most probable degradation mechanisms of the cementitious vault enclosure caused by sulfate and carbonation ingress. This analysis was documented and resulted in the issuance of a SDF Performance Assessment Special Analysis by Liquid Waste Operations this fiscal year. The two new software tools supporting chloride attack and dual-regime flow will provide additional degradation tools to better evaluate performance of DOE and commercial cementitious barriers. The CBP SRNL experimental program produced two patent applications and field data that will be used in the development and calibration of CBP software tools being developed in FY2014. The CBP software and simulation tools differ from other efforts in that all the tools are based upon specific and relevant experimental research of cementitious materials utilized in DOE applications. The CBP FY2013 program involved continuing research to improve and enhance the simulation tools as well as developing new tools that model other key degradation phenomena not addressed in Version 1.0.
Efforts to verify the various simulation tools through laboratory experiments and analysis of field specimens are also ongoing and will continue into FY2014 to quantify and reduce the uncertainty associated with performance assessments. This end-year report summarizes FY2013 software development efforts and the various experimental programs that are providing data for calibration and validation of the CBP-developed software.
A Probabilistic Software System Attribute Acceptance Paradigm for COTS Software Evaluation
NASA Technical Reports Server (NTRS)
Morris, A. Terry
2005-01-01
Standard software requirement formats are written from top-down perspectives only, that is, from an ideal notion of a client's needs. Despite the exactness of the standard format, software and system errors in designed systems have abounded. Bad and inadequate requirements have resulted in cost overruns, schedule slips and lost profitability. Commercial off-the-shelf (COTS) software components are even more troublesome than designed systems because they are often provided as is and subsequently delivered with unsubstantiated validation of described capabilities. For COTS software, there needs to be a way to express the client's software needs in a consistent and formal manner using software system attributes derived from software quality standards. Additionally, the format needs to be amenable to software evaluation processes that integrate observable evidence garnered from historical data. This paper presents a paradigm that effectively bridges the gap between what a client desires (top-down) and what has been demonstrated (bottom-up) for COTS software evaluation. The paradigm addresses the specification of needs before the software evaluation is performed and can be used to increase the shared understanding between clients and software evaluators about what is required and what is technically possible.
Availability of software services for a hospital information system.
Sakamoto, N
1998-03-01
Hospital information systems (HISs) are becoming more important and covering more parts of daily hospital operations as order-entry systems become popular and electronic charts are introduced. Thus, HISs today need to be able to provide the necessary services for hospital operations 24 h a day, 365 days a year. The provision of services discussed here does not simply mean the availability of computers, in which all that matters is that the computer is functioning. It means the provision of necessary information for hospital operations by the computer software, and we will call it the availability of software services. HISs these days are mostly client-server systems. To increase the availability of software services in these systems, it is not enough to just use system structures that are highly reliable in existing host-centred systems. Four main components support the availability of software services: network systems, client computers, server computers, and application software. In this paper, we suggest how to structure these four components to provide the minimum requested software services even if a part of the system stops functioning. The network system should be double-protected in strata, using Asynchronous Transfer Mode (ATM) as its base network. Client computers should be fat clients with as much application logic as possible, and reference information that does not require frequent updates (master files, for example) should be replicated in clients. It would be best if all server computers could be double-protected. However, if that is physically impossible, one database file should be made accessible by several server computers. Still, at least the basic patients' information and the latest clinical records should be double-protected physically. Application software should be tested carefully before introduction. Different versions of the application software should always be kept and managed in case the new version has problems. If a hospital information system is designed and developed with these points in mind, its availability of software services should increase greatly.
ERIC Educational Resources Information Center
Cordier, Deborah
2009-01-01
A renewed focus on foreign language (FL) learning and speech for communication has resulted in computer-assisted language learning (CALL) software developed with Automatic Speech Recognition (ASR). ASR features for FL pronunciation (Lafford, 2004) are functional components of CALL designs used for FL teaching and learning. The ASR features…
Criteria for Evaluating and Selecting Multimedia Software for Instruction.
ERIC Educational Resources Information Center
Lee, Sung Heum; And Others
Evaluating and selecting the appropriate software is a very important component of success in using multimedia systems in both educational and corporate settings. Computer-mediated multimedia (CMM) is the integration of two or more communication media, controlled or manipulated by the user via a computer, to present information. CMM can be…
2001-09-01
Oriented Discrete Event Simulation," Master's Thesis in Operations Research, Naval Postgraduate School, Monterey, CA, 1996. 12. Arntzen, A., "Software...Dependent Hit Probabilities", Naval Research Logistics, Vol. 31, pp. 363-371, 1984. 3 Arntzen, A., "Software Components for Air Defense Planning