Spacecraft Charging Simulation Environment

Project Information

Tags: Parallelization, Simulations, Astrophysics
Project Status: New and Recruiting
Project Region: Northeast
Submitted By: Northeast Cyberteam
Project Email: justin@ema3d.com
Project Institution: Electro Magnetic Applications, Inc
Anchor Institution: NE-MGHPCC
Project Address: Pittsfield, Massachusetts

Mentors: Julie Ma, Justin McKennon

Project Description

The tools currently available to analyze the impact of the space environment on materials fall into three separate and distinct time frames: short-, medium-, and long-term effects. Many of the existing tools were developed by NASA more than 20 years ago, and the currently available codes have significant operational drawbacks. There is strong demand for controllable fidelity in spacecraft charging models, to give end users flexibility in how they deploy the technology. EMA seeks to develop these simulation tools and incorporate them into its platform and, in parallel with its space environmental effects chamber, to overhaul the outdated material characterization databases with new and novel material values and lower the barriers to entry for users performing spacecraft charging simulations. These capabilities will prove invaluable to participants in the space industry.

---

Due to the nature of the space environment, the current tools need to be updated and the three time frames linked together, so that the variables and interdependencies between them are clearly defined and the output can be calculated accordingly. While linking the tools, it is also possible to improve the performance and efficiency of the calculations by updating and streamlining the algorithms and code used in each application. Each simulation must be able to run coupled to or independently of the others, in any permutation.
The end product will support one or multiple coupled time scales in 1D, 2D, or 3D geometries. The codes will be event based, meaning the simulations will call on one another to update or perform calculations when certain criteria are met. This allows the simulations to run integrated or independently. The codes will solve for yield, internal charging, polarization, and other variables relevant to the space environment.

The run time of the simulations depends entirely on their complexity. Our end goal is to solve 1D, 2D, and 3D geometries, each using a different set of equations; the more dimensions, or the more advanced the equations, the longer the run time. Simulations can take hours or days depending on their complexity and length. We would like flexibility in how portable the program is: many users will run simple models on ordinary laptops and desktops, while more advanced simulations would greatly benefit from parallel processing, because each of the three main simulations runs independently and "calls" the others. This makes the suite a strong candidate for HPC. Right now our solvers are in Fortran, but they are all being ported to C++. Our user interface is C#/.NET with API calls into SpaceClaim (an Ansys tool). The meshing code is C++, and we expect these codes to be C++ as well.

The proposed MGHPCC/Northeast Cyberteam/EMA pilot project is a large part of a National Science Foundation (NSF) SBIR proposal EMA is currently preparing (due to be submitted by June 13). Regardless of the outcome of that proposal, EMA will pursue the MGHPCC and Northeast Cyberteam project, but the NSF SBIR is our preferred path at the moment. We expect Phase I to be awarded sometime in the fall and envision the need for 1-2 student interns in Phase I; Phase II is where we would look to bring in multiple student interns (2-5, depending on how things shake out). The student interns would work through EMA as interns.

The project itself involves three simulation codes that need to interface with our CAD front end; we have internal expertise for the CAD interfacing. The students would receive data in some format and apply and evolve the governing equations based on what is selected. This would be a great place for the students to see how the code runs normally and to help us identify the best pieces to optimize for HPC. Not all of the code will be applicable, but some certainly will. Phase I is a "proof of concept" in the eyes of the NSF, and Phase II is extremely competitive. We would leverage the students as needed in Phase I, but we need to understand the financial piece of that first.


Project Milestones:
I. Scope Definition
II. Review of existing tools
a. Identification of areas
i. Retain
ii. Scrap
iii. Retain and Update
iv. Points, variables, or code which will be linked to the other tools
III. System Design Session
IV. Coding
V. Testing
VI. Launch and Implementation
