
No Space For Error

By Jason Gilligan, Content Specialist at Pype

When NASA announced the Artemis Program last month, I was ecstatic! The program would not only see us follow in the footsteps of the Apollo Program as we return to the moon, but would also establish a semi-permanent base on the lunar surface, supported by a “Gateway” space station in lunar orbit. The little kid in me freaked out, the sci-fi nerd in me was excited, and the Greco-Roman history nerd in me was very pleased they named the program after Apollo’s twin sister.

As I kept up with the news surrounding the Artemis Program, I began seeing articles about the contracts NASA was awarding for different systems and technologies. Unlike the Apollo Program, many of Artemis’ pieces will be developed privately in partnership with NASA. As of this writing, NASA has chosen companies to build the lunar landers and the Gateway’s power and propulsion system, along with a whole host of contractors for the Space Launch System rocket, Orion spacecraft, and Exploration Ground Systems programs. And that’s just the tip of the iceberg when it comes to the technology that needs to be developed to make Artemis a reality.

Unsurprisingly, it is very, very difficult to keep humans alive in space. Whatever technology NASA uses to keep its astronauts safe has to endure an incredibly rigorous testing process. And since I was already writing an article about evaluating technology, I decided to look up that process.

NASA’s Technology Readiness Level (TRL) scale is the agency’s nine-level system for assessing a technology’s, well, readiness. The nine levels are:

  1. Basic Principles Observed and Reported
    Someone noticed something cool and thought “Hey, that’s neat. I wonder if we could use that.”
  2. Technology Concept/Application Formulated
    “We could use that to accomplish this task, probably.”
  3. Analytical/Experimental Proof of Concept
    “Yes, we can definitely use that to accomplish this task.”
  4. Component Validation in Laboratory Environment
    “That can work with these other things without screwing it all up in this very specific and controlled situation.”
  5. Component Validation in Relevant Environment
    “That can work with these other things without screwing it all up in the actual conditions it needs to perform in.”
  6. System/Subsystem Model/Prototype Demonstration in Relevant Environment
    “This system works!”
  7. System Prototype Demonstration in Space Environment
    “This system works in space!”
  8. Actual System Completed and “Flight Qualified” through Test and Demonstration
    “This system was launched on a rocket into space and it still worked!”
  9. Actual System “Flight Proven” through Successful Mission Operations
    “We used this system in space on the actual task and accomplished the actual task!”

I’m no engineer, but I think “it accomplished the task” is a pretty solid goal to have, one we should all strive towards in our own approval processes. NASA’s TRL system could serve as a sturdy foundation for anyone evaluating any solution, even if the stakes aren’t nearly so life-and-death. And while it’s probably not necessary for your AI software to work in space, it always pays to be prepared.
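For anyone who wants to borrow the idea, here’s a minimal sketch of what a TRL-style gate could look like in an internal approval workflow. It’s in Python, and every name in it (the ReadinessLevel scale, the ready_to_adopt check, the threshold it defaults to) is a hypothetical illustration, not anything NASA or Pype actually uses:

```python
from enum import IntEnum

# A TRL-style readiness scale, loosely paraphrasing NASA's nine levels.
class ReadinessLevel(IntEnum):
    BASIC_PRINCIPLES = 1            # "Hey, that's neat. I wonder if we could use that."
    CONCEPT_FORMULATED = 2          # "We could use that to accomplish this task, probably."
    PROOF_OF_CONCEPT = 3            # Shown analytically/experimentally to work
    VALIDATED_IN_LAB = 4            # Plays nicely with other components in a controlled setting
    VALIDATED_IN_FIELD = 5          # Works in the actual conditions it needs to perform in
    SYSTEM_DEMONSTRATED = 6         # The whole system works
    DEMONSTRATED_IN_OPERATIONS = 7  # The system works in its real operating environment
    QUALIFIED = 8                   # Completed and qualified through test and demonstration
    PROVEN = 9                      # Accomplished the actual task in real operations

def ready_to_adopt(level: ReadinessLevel,
                   required: ReadinessLevel = ReadinessLevel.QUALIFIED) -> bool:
    """Simple gate: only approve solutions at or above the required readiness level."""
    return level >= required

# A vendor demo that only works under controlled conditions shouldn't clear
# a bar set at "qualified through test and demonstration."
print(ready_to_adopt(ReadinessLevel.VALIDATED_IN_LAB))  # False
print(ready_to_adopt(ReadinessLevel.PROVEN))            # True
```

The code isn’t the point; the point is that “how ready is this, really?” becomes a question you answer against a consistent scale instead of re-arguing it from scratch for every shiny demo.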

Jason Gilligan

A newcomer to the construction tech industry, Jason has a background in content creation and brings a fresh perspective to Pype. A graduate of George Mason's film department, Jason uses both written and visual mediums to share information.

Connect with Jason on LinkedIn.
