Best Sources — SST Infusion¶
Lessons learned across 23 sessions (2026-04-13 to 2026-04-14) about what data sources work for tracing SST technology downstream impact.
Last updated: 2026-04-14 (session 23)
TechPort¶
Strengths:
- technologyOutcomes field reliably captures Advanced_To/Advanced_From links between SST projects and follow-on missions (e.g., Starfish SSPICY concept → flight, OCSD → TBIRD lineage via relatedProjectId)
- libraryItems with attached PDFs and briefing decks are the richest source of technology substance — flight reports, closeout summaries, integration photos
- contacts field identifies PIs, enabling people-chain tracing
- program filter cleanly identifies all 111 SST projects
- portfolio_aggregate group_by leadOrg gives instant org footprint map
Weaknesses:
- technologyOutcomes populated on <30% of projects. Most SST outcomes are NOT recorded in TechPort structured fields
- No cross-agency tracking — SST→NSF transitions (VISORS, SWARM-EX) are invisible
- No commercial outcome tracking — acquisitions, product launches, DoD contracts not captured
- Library item quality varies wildly: some projects have detailed closeout PDFs, most have only a link to the generic SST website
- TRL fields are self-reported and sometimes stale (CLICK shows TRL 4 despite having flown)
Best practice: Always read every library item. Document reads yield roughly 80% of the infusion evidence; metadata alone produces a shallow KB.
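The triage above can be sketched as a small extractor over a TechPort project record. The field names (`technologyOutcomes`, `libraryItems`, `contacts`) come from this section; the exact response shape of the TechPort API is an assumption to verify against the live service.

```python
def extract_infusion_leads(project: dict) -> dict:
    """Pull the three infusion-relevant fields out of one project record.

    Field names mirror the TechPort fields discussed above; the nested
    structure (title/fullName/role keys) is assumed, not confirmed.
    """
    outcomes = project.get("technologyOutcomes") or []
    library = [item.get("title") for item in project.get("libraryItems", [])]
    pis = [c.get("fullName") for c in project.get("contacts", [])
           if c.get("role") == "Principal Investigator"]
    return {"outcomes": outcomes, "library_items": library, "pis": pis}
```

Each returned PI name then feeds the web-search step, and each library item title goes on the must-read list.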
USASpending¶
Strengths:
- Best source for federal funding footprint. Dollar totals per company are demo-ready ("$52.8M across 19 awards")
- Cross-agency visibility: shows NASA, DoD, AFRL, Space Force, DARPA awards for the same company
- Award descriptions reveal contract purpose (e.g., "SBIR PHASE II - FRC MULTI-MODE THRUSTER")
- Period of performance shows timeline of company growth
Weaknesses:
- Recipient name matching is fuzzy — companies change names (Tethers Unlimited → ARKA → CACI), and name variants (MSNW LLC vs MSNW INC) split results
- Small awards (<$100K) sometimes missing
- No technology-level detail — awards are to companies, not to specific technologies
Best practice: Search by company name, then verify with web search for name changes/acquisitions. Always note total dollar amounts — they're the most compelling data point for external audiences.
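One way to blunt the name-variant problem is to query all known variants in a single payload. The sketch below targets the public USASpending v2 `spending_by_award` endpoint; the filter keys and field names follow my understanding of that API and should be checked against its docs before use.

```python
# Assumed endpoint of the public USASpending v2 search API.
USASPENDING_URL = "https://api.usaspending.gov/api/v2/search/spending_by_award/"

def build_award_query(name_variants: list[str]) -> dict:
    """Build one POST payload covering every known company name variant
    (e.g. ["MSNW LLC", "MSNW INC"] or the Tethers Unlimited / ARKA chain)."""
    return {
        "filters": {
            "recipient_search_text": name_variants,
            "award_type_codes": ["A", "B", "C", "D"],  # contract award types
        },
        "fields": ["Award ID", "Recipient Name", "Award Amount",
                   "Description", "Start Date", "End Date"],
        "limit": 100,
    }
```

Summing `Award Amount` across the merged results gives the demo-ready dollar total; the `Description` field supplies the contract-purpose evidence noted above.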
NTRS (NASA Technical Reports)¶
Strengths:
- Publication counts validate research impact (e.g., iSat produced 11 NTRS papers despite never flying)
- TBIRD has 15 NTRS papers — highest for any SST project, reflecting its record-setting flight
- Contract/grant numbers in paper acknowledgements link publications to TechPort projects
- Author search reveals PI publication history and collaborator networks
Weaknesses:
- Coverage is inconsistent — many SST projects have 0 NTRS hits even when publications exist (papers may be in SPIE, IEEE, AIAA proceedings instead)
- CLICK-A has 0 NTRS hits despite a 2023 SmallSat Conference paper and SPIE papers — non-NTRS venues dominate for laser comms
- Search relevance is sometimes poor — keyword searches return many false positives
Best practice: Search by author name AND by contract number. Contract numbers (e.g., 80NSSC19K0342) appear in paper acknowledgements and are more reliable keys than keywords.
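The dual-key practice can be captured as two query objects per project. The NTRS search endpoint and its `q` parameter are assumptions based on the public NTRS citation search; verify before automating.

```python
# Assumed endpoint of the NTRS citation search API.
NTRS_SEARCH = "https://ntrs.nasa.gov/api/citations/search"

def ntrs_queries(author: str, contract: str) -> list[dict]:
    """Build both recommended NTRS queries for one project.

    Contract numbers (e.g. 80NSSC19K0342) appear verbatim in paper
    acknowledgements, so a plain keyword match on the contract number
    is the more reliable key; the author-name query catches the rest.
    """
    return [{"q": contract}, {"q": author}]
```

Run the contract-number query first; fall back to the author query only when it returns nothing, since bare keyword searches are the false-positive-prone path.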
SBIR.gov¶
Strengths:
- Shows pre-SST SBIR history for companies, establishing technology lineage (e.g., Busek had NASA SBIRs before SST)
- Cross-agency SBIR visibility (NASA, DoD, DOE)
Weaknesses:
- API under maintenance as of April 2026 — queries unreliable
- When working, firm name matching has the same issues as USASpending
Best practice: Use as supplementary source when USASpending leaves gaps in the pre-SST funding timeline.
SEC EDGAR¶
Strengths:
- 10-K and S-1 filings for public companies mention specific product lines and technology provenance
- Acquisition 8-K filings capture dollar values and strategic rationale
Weaknesses:
- Only useful for public companies. Most SST-funded companies are private (exceptions: Terran Orbital → Lockheed Martin via SPAC, Rocket Lab IPO, Redwire IPO)
- Product-level granularity is poor in financial filings — hard to trace specific SST-funded technologies
Best practice: Use for acquisition-track companies. For the defense-prime acquisition pattern (Archetype #1), EDGAR confirms valuation and strategic rationale.
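For acquisition-track companies, the filings worth reading can be filtered out of EDGAR's per-company submissions feed. The column-parallel JSON shape below mirrors `data.sec.gov/submissions/CIK##########.json` as I understand it; treat the field names as assumptions.

```python
def acquisition_track_filings(recent: dict) -> list[tuple[str, str]]:
    """Return (form, filingDate) pairs for the filing types named above:
    10-K / S-1 for product lines and provenance, 8-K for acquisitions.

    `recent` is assumed to hold parallel lists under "form" and
    "filingDate", per the EDGAR submissions JSON layout.
    """
    wanted = {"10-K", "S-1", "8-K"}
    return [(form, date)
            for form, date in zip(recent["form"], recent["filingDate"])
            if form in wanted]
```

This keeps the reading list short: Form 4s, 13Fs, and other noise in the feed never reach the analyst.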
Web Search (press releases, news, mission records)¶
Strengths:
- The #1 source for reclassifications. 8 of 10 reclassifications (false-negative corrections) were discovered via web search, not TechPort or USASpending
- PI name searches reveal cross-agency transitions (SST→NSF, SST→DoD) invisible to any single database
- Company press releases announce product launches, mission wins, venture funding
- Mission databases (Gunter's Space Page, Wikipedia spacecraft lists, NASA mission pages) confirm flight heritage
- SmallSat Conference proceedings (digitalcommons.usu.edu) are a rich source of flight results
Weaknesses:
- Non-reproducible — search results change over time
- Risk of confirmation bias — easy to find what you're looking for
- No structured data — findings require manual interpretation
Best practice: Web search is essential, not supplementary. Every PI name from a no-visible-outcome project should be searched to check for hidden downstream activity. The VISORS and SWARM-EX convergences were found exclusively via web search; no other data source would have revealed them.
Cross-Source Patterns¶
Most productive investigation sequence:
1. TechPort → get project record + read documents
2. USASpending → get company federal footprint
3. Web search → check PI name + company name for downstream
4. NTRS → find publication count and key papers
5. SBIR.gov → check pre-SST lineage (when API working)
False-negative indicators (signs a "no-visible-outcome" project may be misclassified):
- PI has multiple SST projects (serial funding suggests NASA saw value)
- Company has >$10M USASpending footprint (company succeeded, maybe SST contributed)
- Project TRL advanced ≥2 levels (technology matured, may have been adopted quietly)
- PI moved institutions (knowledge transfer may have happened informally)
- CubeSat Launch Initiative selected the technology for a flight (TechPort doesn't always capture CSLI)
- SST→FO pipeline (session 20): University projects that transition to Flight Opportunities are invisible in TechPort unless you search the PI's name in other programs. This pattern produced 4 reclassifications (Purdue, Montana State, SDSU, UCLA) and is the dominant university transition pathway
- IEEE/Optica Fellow PIs: High-stature PIs (Sharma IEEE Fellow, Vahala NAS/IEEE/Optica Fellow, Wong 180+ papers) tend to have downstream activity even when the SST project looked like a dead end
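The indicator list lends itself to a simple triage score: one point per tripped indicator, so projects with several flags get re-investigated first. The indicator names below paraphrase the list; the equal weighting is an assumption, not something the sessions established.

```python
# Recognized false-negative indicators (hypothetical short names for the
# bullets above); unknown flags are ignored by the scorer.
INDICATORS = {
    "serial_pi",      # PI has multiple SST projects
    "big_footprint",  # company >$10M USASpending footprint
    "trl_jump",       # TRL advanced >=2 levels
    "pi_moved",       # PI changed institutions
    "csli_flight",    # CSLI selected the technology for flight
    "fo_transition",  # SST -> Flight Opportunities pipeline
    "fellow_pi",      # IEEE/Optica/NAS Fellow PI
}

def false_negative_score(flags: set[str]) -> int:
    """Count how many recognized indicators a project trips."""
    return sum(1 for f in flags if f in INDICATORS)
```

A score of 2 or more is a reasonable (assumed) threshold for pulling a "no-visible-outcome" project back into the web-search queue.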
True negative indicators (high confidence the project had no downstream):
- Project terminated for technical failure (Reaction Sphere, Aerojet Green Prop, Hyper-XACT)
- TRL stayed flat (TRL 3→3 or 2→2)
- Company has <$5M total federal footprint and no web presence
- Academic project in thermal/power/sensors domain (83% ceiling rate)