
Single Event Effects Analysis

Pablo Pita Leira edited this page on 6 Feb 2018


Multi-Scale High Accuracy Engineering Tools for Single Event Effects Analysis in Modern Technologies - EXPRO+

Relevant information

  • [RD1] CREME96: A. J. Tylka, J. H. Adams, Jr., P. R. Boberg, B. Brownstein, W. F. Dietrich, E. O. Flueckiger, E. L. Petersen, M. A. Shea, D. F. Smart, and E. C. Smith, "CREME96: A Revision of the Cosmic Ray Effects on Micro-Electronics Code", IEEE Trans. Nucl. Sci., vol. 44, no. 6, pp. 2150-2160, Dec. 1997.
  • [RD2] GRAS: G. Santin, V. Ivanchenko, H. Evans, P. Nieminen, and E. Daly, "GRAS: A general-purpose 3-D modular simulation tool for space environment effects analysis", IEEE Trans. Nucl. Sci., vol. 52, no. 6, pp. 2294-2299, Dec. 2005. URL: http://space-env.esa.int/index.php/geant4-radiation-analysis-for-space.html
  • [RD3] GEMAT: URL: http://spitfire.estec.esa.int/trac/GEMAT/
  • [RD4] CIRSOS: URL: http://spitfire.estec.esa.int/trac/CIRSOS/
  • [RD5] MUSCA SEP3: G. Hubert, S. Duzellier, C. Inguimbert, C. Boatella Polo, F. Bezerra, and R. Ecoffet, "Operational SER Calculations on the SAC-C Orbit Using the Multi-Scales Single Event Phenomena Predictive Platform (MUSCA SEP3)", IEEE Trans. Nucl. Sci., vol. 56, no. 6, pp. 3032-3042, Dec. 2009. URL: http://www.onera.fr/en/desp/musca-sep3 (not found). See also: https://www.onera.fr/sites/default/files/actualites/agenda/theses/soutenance-Ahmad-Al-Youssef-25102017.pdf
  • [RD6] CRÈME-MC: R. A. Weller, M. H. Mendenhall, R. A. Reed, R. D. Schrimpf, K. M. Warren, B. D. Sierawski, and L. W. Massengill, "Monte Carlo simulation of single event effects", IEEE Trans. Nucl. Sci., vol. 57, no. 4, pp. 1726-1746, Aug. 2010.
  • [RD7] DESMICREX: Evaluation of Radiation Effects in Deep Sub-Micron CMOS Technologies, ESA contract 21855/09/NL/J K.

Work Logic Summary

The work shall be executed in the following tasks, organised in two technical phases:

Phase 1
  • Task 1: Identification of candidate devices
Phase 2
  • Task 2: Review of new modelling capabilities and requirements
  • Task 3: Review of pre-flight behaviour predictions
  • Task 4: Analysis of device technology
  • Task 5: Model development
  • Task 6: Prototype tool development
  • Task 7: Model and tool verification and validation, experimental tests
  • Task 8: Proposals for prediction/analysis methodology for SEE in modern technologies

Proposal

My responsibilities:

  • Data Access Working Group (DAWG): contact EUMETSAT in Darmstadt for data access.

Personnel

  • Analyst Assistant (DAWG)

Task 1: Identification of candidate devices (IRS Stuttgart?)


Input: SoW

Task description:

  • Survey in-flight SEE behaviour of electronics on-board European missions (ESA, EUMETSAT preferred), with emphasis on obtaining information on the most advanced technologies possible.
  • The work under Task 1 will be organised in a Data Access Working Group (DAWG) under the Contractor responsibility. The DAWG shall include key persons from ESA, nominated by the ESA Technical Officer, including Project teams, to assist the Contractor in data access issues.
  • Identify the key data types needed to perform analysis of in-flight SEE behaviour, and verify its availability (and the related terms of use) by discussing with flight operations teams. The Agency encourages in particular access to data from Copernicus Sentinel-1, -2, and -3 missions due to the concurrent availability of pre- and post-launch information. Other missions for consideration include LPF, Proba, GAIA, SWARM, Alphasat, etc. if the Contractor thinks that they bring added value to the activity and that sufficient information will be available for the execution of the Task. The data shall allow the part experiencing the SEE to be identified and shall include time and location of the spacecraft at the time of the SEE in order to support distinction between cosmic rays, solar particle events and radiation belts as the cause.
  • Verify availability of, and demonstrate access to, highly relevant test data by discussing with project teams and the ESA TEC-QEC section. Test data shall be available to ESA personnel.
  • Verify availability of, and demonstrate access to, technology data (data on component sensitive nodes’ structures and materials) and operations data (“live times”, duty cycles, etc.). Information on technologies shall be available to ESA personnel.
  • Summarize all gathered information and propose a list of candidate devices for detailed analysis to be performed in the subsequent tasks of the activity. Candidate devices shall include different types, e.g. memories, microprocessors, FPGAs, linear and power devices, converters, amplifiers, etc.
  • A preliminary verification and validation plan shall be proposed for the activity to be performed in Task 7, including a test plan for the experimental validation with related expected benefits and likely costs. The Contractor shall be in charge of the test part procurement. The procured parts shall be of the same reference and technology as the flight parts.
  • After completing the above tasks, including the delivery of the related documentation to the Agency, the Contractor shall organise a Device Identification Review (DIR) according to Section 4.3.1.

Output / Approval conditions

  • TN1.1 “Identification of candidate devices for detailed analysis” including a list of candidate devices supported by all related gathered information / to be approved at Device Identification Review (DIR).
  • TN1.2 Verification and Validation Plan (Preliminary) / to be approved at Device Identification Review (DIR)

Task 2: Review of new modelling capabilities and requirements

Input: TN1.1, DIR output

Task description

  • On-going developments in Europe and elsewhere, and complementary activities in the domain of SEE modelling, shall be comprehensively evaluated in the context of the in-flight anomaly experience of the parts selected by the Device Identification Review. The review shall include (but not be limited to) embryonic or recently established next generation SEE tools such as MUSCA SEP3 [RD5], DESMICREX [RD7], CREME-MC [RD6], and GEMAT [RD3].
  • The device or system information needed as input to the algorithms used by the tools shall be critically reviewed.

Output / Approval conditions

  • TN2.1 Review of new modelling capabilities and requirements / to be approved at System Requirements Review (SRR)

Task 3: Review of pre-flight behaviour predictions

Input: TN1.1, TN2.1, DIR output

Task description

  • For the devices selected at DIR, obtain ground test reports and radiation analyses compiled at the time of project development, as well as latest relevant test results coming from other sources.
  • Review pre-flight radiation assessment of the part (including assumptions on device geometry, functional details, and mission or system operation).
  • Obtain in-flight performance data for the devices selected at DIR.
  • Calculate expected rates for those devices based both on traditional methods and tools, and (where possible) on the next generation SEE tools identified in Task 2. Compare the predictions to the original part assessment and to the observed performance.
  • Identify major discrepancies between predictions, tests and in-flight performance. Assess quality and completeness of test data. Discuss main results, missing technology information and possible shortcomings of both traditional and new prediction methods.
  • Based on this analysis and on the review of new modelling capabilities (Task 2), formulate preliminary requirements for an advanced SEE modelling strategy, including part investigations and new model developments to be performed under Task 4 and Task 5.
  • After completing the above tasks, including the delivery of the related documentation to the Agency, the Contractor shall organise a System Requirements Review (SRR) according to Section 4.3.2.
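
The "traditional methods" referred to above are typically of the CREME96 type [RD1]: the device heavy-ion cross-section measured on ground is fitted with a Weibull function and folded with the LET spectrum of the mission environment. A minimal sketch of that folding, with a made-up power-law spectrum and hypothetical fit parameters (not from any real device):

```python
import math

def weibull_cross_section(let, sigma_sat, let_th, width, shape):
    """Weibull fit of the device SEE cross-section [cm^2] vs. LET [MeV cm^2/mg]."""
    if let <= let_th:
        return 0.0
    return sigma_sat * (1.0 - math.exp(-(((let - let_th) / width) ** shape)))

def heavy_ion_rate(diff_flux, sigma_params, let_grid):
    """Fold a differential LET spectrum with the cross-section (trapezoid rule).
    diff_flux: particles / (cm^2 day) per unit LET; returns events/day."""
    rate = 0.0
    for i in range(len(let_grid) - 1):
        l0, l1 = let_grid[i], let_grid[i + 1]
        f0 = diff_flux(l0) * weibull_cross_section(l0, *sigma_params)
        f1 = diff_flux(l1) * weibull_cross_section(l1, *sigma_params)
        rate += 0.5 * (f0 + f1) * (l1 - l0)
    return rate

# Illustrative-only numbers: a power-law spectrum and a made-up Weibull fit.
spectrum = lambda let: 1.0e2 * let ** -3.0   # differential flux (hypothetical)
params = (1.0e-6, 1.5, 10.0, 1.5)            # sigma_sat, LET_th, W, s (hypothetical)
grid = [1.5 + 0.5 * k for k in range(198)]   # LET from 1.5 to 100 MeV cm^2/mg
print("rate [events/device/day]:", heavy_ion_rate(spectrum, params, grid))
```

Comparing such a folded estimate with both the original project assessment and the observed in-flight rate is exactly the discrepancy analysis this task calls for.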

Output / Approval conditions

  • TN3.1 In-flight SEE predictions and comparison to flight data / to be approved at System Requirements Review (SRR)
  • TN1.2 Verification and Validation Plan (Updated) / to be approved at System Requirements Review (SRR)

Task 4: Analysis of device technology

Input: TN1.1, TN2.1, TN3.1, outcome of SRR

Task description

  • For the devices selected at DIR, obtain manufacturer information on the related technology, including the 3-D device structure and any other input parameters required by next generation SEE prediction algorithms (as identified in Task 2), to be used later in the simulations (Task 5).
  • If insufficient information on the 3-D structures is available, alternative ways of obtaining the necessary information shall be pursued, including e.g. device reverse engineering.

Output / Approval conditions

  • TN4.1 Analysis of device technology / to be approved at Model Readiness Review (MRR)

Task 5: Model development

Input: TN1.1, TN2.1, TN3.1, outcome of SRR, TN4.1

Task description

  • Set up Geant4 simulation of (i) space situation; (ii) ground test set-up; and compute rates including full 3-D structure and physics. The simulations shall preferably use GRAS and/or GEMAT, via the CIRSOS [RD4] simulation framework.
  • Based on the outcome of the previous reviews and part investigations, as mutually agreed at SRR, the Contractor shall develop new algorithms and modelling methods for prediction of SEE both in ground-test and in-flight conditions. Modelling capabilities shall include description of charge creation and collection induced by charged particles, including low-energy and secondary particle transport in semiconductor materials. Where necessary, as agreed at SRR, processes including charge collection in deep-sub-micron devices, electron-hole pair transport in dynamic electric fields, the effects of nuclear interactions in overlayers and interconnects, and non-homogeneous, complex sensitive volume geometries and sensitivities shall be treated.
  • During development, the new algorithm and model results shall be iteratively compared with Geant4 simulations.
  • Where possible, the development activities shall be coordinated with ongoing activities in Europe and elsewhere.
  • User requirements, software design and software requirements shall be produced in preparation for the tool development of Task 6.
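
The charge-creation-and-collection step described above can be illustrated with a toy post-processing of per-event energy deposits such as those produced by a Geant4/GRAS run: convert the energy deposited in a sensitive volume into collected charge via the mean electron-hole pair creation energy of silicon (about 3.6 eV per pair), and count events whose charge exceeds a critical charge. All numbers below are illustrative, not from any real device model:

```python
ELECTRON_CHARGE_FC = 1.602e-4   # fC per elementary charge
PAIR_ENERGY_EV = 3.6            # mean e-h pair creation energy in silicon [eV]

def collected_charge_fc(edep_mev, collection_efficiency=1.0):
    """Charge [fC] collected from an energy deposit [MeV] in silicon,
    assuming a simple linear charge-yield model (1 MeV -> ~44.5 fC)."""
    pairs = edep_mev * 1.0e6 / PAIR_ENERGY_EV
    return pairs * ELECTRON_CHARGE_FC * collection_efficiency

def count_upsets(edep_events_mev, q_crit_fc):
    """Count events whose collected charge exceeds the critical charge."""
    return sum(1 for e in edep_events_mev if collected_charge_fc(e) > q_crit_fc)

# Hypothetical per-event energy deposits in a sensitive volume [MeV]:
events = [0.01, 0.12, 0.50, 0.03, 1.20]
print(count_upsets(events, q_crit_fc=5.0))  # prints 3 for Q_crit = 5 fC
```

Real modelling of dynamic fields and non-homogeneous sensitive volumes, as required by this task, goes well beyond this linear picture; the sketch only fixes the units and the overall flow from energy deposit to upset decision.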

Output / Approval conditions

  • SW5.1 Geant4 models including source code, geometry and input scripts
  • TN5.1 Model and algorithm documentation
  • SW5.2 New models for SEE prediction
  • TN5.2 User Requirements Document, Software Design Document, Software Requirements Document
  • TN1.2 Verification and Validation Plan (Consolidated)

All above mentioned deliverables are to be approved at Model Readiness Review (MRR).

Task 6: Prototype tool development

Input: Output from Tasks 1-5

Task description

  • A suitable software tool shall be prototyped (or modified based on pre-existing developments), openly accessible via an API (Application Programming Interface) and properly documented, in view of being used by external users after the activity is completed.
  • Preliminary simulations of devices shall be performed, both in ground test and in-flight conditions, as input for the model verification and validation and for the finalisation of the experimental test plan.
  • The preliminary Verification and Validation plan TN1.2 shall be consolidated based on the information available from Task 4 and Task 5.
  • The final prototype SEE prediction tool shall be made available via the SPENVIS framework, by developing the user interface and related directives to the tool via the developed API.
  • After completing the above tasks, including the delivery of the related documentation to the Agency, the Contractor shall organise a Model Readiness Review (MRR) according to Section 4.3.3.
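
As a sketch of what "openly accessible via an API" could look like from an external user's side, the following assembles a JSON request body for a rate-prediction call. The host, endpoint and all field names are assumptions for illustration only; the actual API is itself a deliverable of this task:

```python
import json

# Hypothetical client-side sketch; "example.invalid" and all field names
# are placeholders, not the actual API of the tool.
API_ROOT = "https://example.invalid/see-tool/v1"

def build_rate_request(device_id, environment, orbit):
    """Assemble the JSON body for a (hypothetical) rate-prediction call."""
    return json.dumps({
        "device": device_id,         # part selected at DIR
        "environment": environment,  # e.g. "GCR-solar-min"
        "orbit": orbit,              # e.g. {"type": "SSO", "altitude_km": 700}
        "output": ["rate_per_device_day", "per_mode_breakdown"],
    })

body = build_rate_request("FPGA-XYZ", "GCR-solar-min",
                          {"type": "SSO", "altitude_km": 700})
print(body)
```

A SPENVIS user interface on top of such an API, as required above, would generate and submit bodies of this kind from web forms.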

Output / Approval conditions:

  • TN6.1 Tool developments and simulation results
  • SW6.1 Tool for SEE prediction
  • SW6.2 SPENVIS interface to the SEE prediction tool
  • TN5.2 User Requirements Document, Software Design Document, Software Requirements Document (Updated)
  • TN6.2 Tool User Manual
  • TN1.2 Verification and Validation Plan (Consolidated)

All above mentioned deliverables are to be approved at Model Readiness Review (MRR).

Task 7: Model and tool verification and validation, experimental tests

Input: TN1.2 Verification and Validation Plan (Consolidated)

Task description

  • The new methods and tools developed under this activity shall be verified and validated against on-ground and in-flight data.
  • Tests shall be executed in accelerator facilities with representative test subjects as from the consolidated Verification and Validation Plan.
  • Comparisons shall also be performed against methods from other tools and activities where appropriate, as identified in Task 2.
  • An abstract for a conference and a paper for publication in a peer reviewed journal shall be prepared and submitted to ESA for approval. Only after ESA approval shall the results be submitted.
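
Validation against in-flight data usually means comparing a predicted mean number of events with a small observed count, for which a one-sided Poisson test is the natural check of compatibility. A sketch, with illustrative numbers only:

```python
import math

def poisson_p_at_least(observed, expected):
    """P(N >= observed) for N ~ Poisson(expected): how surprising the
    flight count is under the predicted rate (one-sided)."""
    # Sum the CDF term by term; fine for the small counts typical of SEE data.
    p_below = sum(math.exp(-expected) * expected ** k / math.factorial(k)
                  for k in range(observed))
    return 1.0 - p_below

# Illustrative numbers: 2.0 events predicted over the mission window,
# 7 events observed in flight telemetry.
predicted, observed = 2.0, 7
p = poisson_p_at_least(observed, predicted)
print(f"P(N >= {observed} | mu = {predicted}) = {p:.4f}")
```

A small p-value here would flag the kind of prediction/flight discrepancy that the verification and validation report (TN7.1) must document and explain.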

Output / Approval conditions

  • TN7.1 Verification and Validation report
  • TN7.2 Experimental methods, results and detailed log files
  • Abstract for conference and paper

All above mentioned deliverables are to be approved at Final Acceptance Review (FAR).