GovCon Bid and Proposal Insights
Technical and Management Advisory Services (TMAS) 3 - Department of the Air Force
Explore the upcoming TMAS 3 contract—a $916M follow-on supporting the U.S. Air Force Test Center. Learn about the scope, key incumbents, evaluation criteria, and strategies to prepare for the October 2025 RFP.
Key Details:
•Contract Value: $916M
•Estimated RFP Release Date: October 2025
Listen to the podcast to position your business for success.
Contact ProposalHelper at sales@proposalhelper.com to find similar opportunities and help you build a realistic and winning pipeline.
Speaker 1:Welcome to the Deep Dive, where we take complex documents and, well, turn all that information overload into those really satisfying aha moments, hopefully. Right, today we're cracking open a stack of federal contracting documents, specifically the Air Force's evaluation criteria for their Technical and Management Advisory Services, that's TMAS 2, proposals.
Speaker 2:And we've also got some interesting Q&A from an industry day for the next iteration, TMAS 3. Gives us a little peek into what's coming.
Speaker 1:Exactly so. It's a look behind the curtain, really.
Speaker 2:It really is. It shows you how the US government, well, the Air Force in this case, vets contractors for these really critical advisory roles. It's not just, you know, who says they're the best. It's about who can actually prove they have this specific mix: technical skill, solid experience and, crucially, realistic finances.
Speaker 1:Absolutely so. Our mission today is to kind of demystify these documents for you. We'll dig into the two big factors that can make or break a proposal. We'll uncover some honestly surprising details in their self-scoring system and there's some serious stakes there and even look at what happens if proposals end up tied. So yeah, let's unpack this.
Speaker 2:Let's do it.
Speaker 1:Okay, so at the heart of it all is this thing called a fair opportunity down selection process. Sounds fancy, but think of it like a really competitive multi-stage evaluation.
Speaker 2:Right, and the end goal is always to find the highest technically rated offer, the HTRO, with a realistic and reasonable price.
Speaker 1:Or RRP. So, HTRO with RRP. Got it.
Speaker 2:And what's really key here, I think, is how explicitly they talk about risk. They literally state that having a strong track record you know prior performance history in the key areas directly lowers the risk of failure.
Speaker 1:Makes sense.
Speaker 2:Conversely, if you have little or no history, well, you're seen as a higher risk. Simple as that. This isn't just paperwork. It's about protecting missions.
Speaker 1:Right, minimizing operational risk. So it's definitely not just about being the cheapest.
Speaker 2:Not at all.
Speaker 1:It's this balance, and they weigh two big factors. Factor one, that's the contractor rating system, all about technical and management chops, and factor two is cost and price. We're going to spend a good bit of time on factor one first, because that's where, you know, the technical strengths and some really interesting strategic stuff comes out.
Speaker 2:Yeah, that's where the meat is, technically speaking.
Speaker 1:OK, now this part I find pretty clever. Contractors don't just say they're capable, they actually have to self-score themselves against all these criteria. But here's the catch.
Speaker 2:Every single point you claim, you need rock solid evidence. They call it a body of facts, data proof. If you can't back it up, it doesn't count.
Speaker 1:And what happens if your score isn't, you know, substantiated?
Speaker 2:Well, the government evaluation team validates everything. And here's the really tough part: they only adjust scores down, they never bump you up. Ouch, yeah. And if you have unsubstantiated or, worse, misleading claims, even on just one thing, your whole proposal could be deemed unawardable. Done.
Speaker 1:Wow, ice. Dace is right, don't fudge the numbers.
Speaker 2:Definitely not. Prove it or risk everything.
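As a way to picture that prove-it-or-lose-it validation, here is a minimal sketch in Python. The claim fields and point values are hypothetical, purely illustrative; and as the discussion notes, in practice even a single unsubstantiated claim can put the whole proposal at risk, not just the points attached to it.

```python
# Illustrative only: hypothetical claim structure, not the actual TMAS scoring schema.
def validate_self_score(claims):
    """claims: list of dicts like {"points": 5, "substantiated": True, "misleading": False}."""
    total = 0
    for claim in claims:
        if claim.get("misleading"):
            # A misleading claim can render the whole proposal unawardable.
            return {"awardable": False, "validated_score": 0}
        if claim.get("substantiated"):
            total += claim["points"]
        # Unsubstantiated points are simply not counted: scores only move down, never up.
    return {"awardable": True, "validated_score": total}

print(validate_self_score([
    {"points": 10, "substantiated": True},
    {"points": 5, "substantiated": False},  # claimed but unproven, so it is dropped
]))  # {'awardable': True, 'validated_score': 10}
```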
Speaker 1:So the proof comes from these work samples? Right, that's the evidence.
Speaker 2:That's the main evidence. Yes, you can submit up to five work samples. They have to be federal contracts or task orders.
Speaker 1:Okay.
Speaker 2:And they need at least six months of performance within the last five years and, critically, you need satisfactory or better CPARs, that's Contractor Performance Assessment Reports, on the latest assessment.
Speaker 1:CPARs, right. What if you don't have one?
Speaker 2:If CPARs aren't available for some reason, they do allow a past performance rating form as an alternative.
Speaker 1:Gotcha. And there are rules about who can claim what experience, right? Like prime versus subs.
Speaker 2:Oh, very specific rules. For what they call Category 1, at least three of your five samples have to be from the prime offeror directly.
Speaker 1:Okay, three from the prime.
Speaker 2:Although one of those three can be work they did as a subcontractor, so a little flexibility there, right. Then there's Category 2, which lets you use samples from your team members, your proposed subs.
Speaker 1:Conditions apply.
Speaker 2:Big time. The team member has to be doing at least 10% of the actual work on the new contract, you need a signed teaming agreement up front, and you can only use a maximum of two Category 2 samples.
Speaker 1:So it encourages teaming, but the prime really has to carry the weight of the experience.
Speaker 2:Exactly, the onus is definitely on the prime contractor.
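For readers who want the sample-mix rules in one place, here is a rough sketch of the counting logic just described. The data shape is invented for illustration, and the check deliberately skips the other conditions (six months of performance within the last five years, satisfactory-or-better CPARs).

```python
# Hypothetical data shape; only the prime/team-member counting rules are modeled.
def check_sample_mix(samples):
    """samples: list of dicts like {"source": "prime"} or
    {"source": "team_member", "workshare_pct": 12, "teaming_agreement": True}."""
    if len(samples) > 5:
        return "Too many samples (maximum of five)."
    prime = [s for s in samples if s["source"] == "prime"]
    team = [s for s in samples if s["source"] == "team_member"]
    if len(prime) < 3:
        return "At least three samples must come from the prime offeror."
    if len(team) > 2:
        return "No more than two Category 2 (team member) samples."
    for s in team:
        if s.get("workshare_pct", 0) < 10 or not s.get("teaming_agreement"):
            return "Team member samples need at least 10% workshare and a signed teaming agreement."
    return "Sample mix is consistent with the stated rules."

print(check_sample_mix([
    {"source": "prime"}, {"source": "prime"}, {"source": "prime"},
    {"source": "team_member", "workshare_pct": 12, "teaming_agreement": True},
]))  # Sample mix is consistent with the stated rules.
```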
Speaker 1:Okay, moving beyond just the technical ability, they also look at your organizational strength, right? How stable and capable you are overall.
Speaker 2:Absolutely. Factor one includes these general capabilities. Let's start with total positions. This is basically about scale. They want to know the maximum number of filled positions you managed on your single largest prime contract in the last five years.
Speaker 1:So it shows you can handle a big workforce, something similar in size to the new contract.
Speaker 2:Precisely, it's about proving capacity, scalable capacity.
Speaker 1:And it's not just how many, but where you manage them. That leads to geographically separated work locations.
Speaker 2:Right, managing distributed teams is key, and what's really interesting here is how the requirement changes depending on the Air Force unit.
Speaker 1:Oh, yeah, how so.
Speaker 2:Well, the minimum number of CMEs contractor man-year equivalents per location varies. For the 412th test wing and AEDC it's just one CME per location.
Speaker 1:OK, so maybe just one key person somewhere.
Speaker 2:Could be, but for the 96th Cyberspace Test Group it jumps to 10 CMEs minimum per location, and for the 96th test wing it's four CMEs.
Speaker 1:Wow, 10 CMEs. That's a whole team.
Speaker 2:Exactly and all locations have to be at least 100 miles apart, and again demonstrated on a prime contract. It really shows they've tailored this to the specific operational needs. Some need wide distribution, others need substantial teams in multiple places.
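Those per-unit thresholds boil down to a small lookup, sketched below with shorthand unit labels; the function is purely illustrative.

```python
# Minimum CMEs per geographically separated location, per the discussion above.
MIN_CMES_PER_LOCATION = {"412 TW": 1, "AEDC": 1, "96 TW": 4, "96 CTG": 10}
MIN_SEPARATION_MILES = 100  # all locations must be at least 100 miles apart

def location_qualifies(unit, cmes_at_location, miles_from_nearest_other_location):
    return (cmes_at_location >= MIN_CMES_PER_LOCATION[unit]
            and miles_from_nearest_other_location >= MIN_SEPARATION_MILES)

print(location_qualifies("96 CTG", 10, 250))  # True
print(location_qualifies("96 CTG", 4, 250))   # False: this unit needs 10 CMEs per location
```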
Speaker 1:That makes sense. Okay, next is incumbent positions transitioned. This sounds important.
Speaker 2:It's huge. It's about showing you can take over smoothly, defined as hiring folks who worked for the previous contractor within 90 days, on a single prime work sample.
Speaker 1:And how important is it?
Speaker 2:So important. It's the number one tiebreaker if scores are identical for all the units.
Speaker 1:Whoa Okay, so continuity is king.
Speaker 2:Absolutely Minimizing disruption is clearly a massive priority for them.
Speaker 1:Then there's the flip side: non-incumbent personnel hired.
Speaker 2:Right. This measures how fast you can bring in new talent, specifically hiring engineering support or cyber engineering personnel for the 96 CTG within just 30 days. Again, on a single prime sample.
Speaker 1:So keep the old team happy, but also be ready to staff up quickly with new people.
Speaker 2:Exactly Stability and agility.
Speaker 1:Okay, what about money? Financial stability.
Speaker 2:Yeah, this isn't just are you profitable? It's about having accessible funds or a line of credit ready to go, equal to 25% of the base period cost.
Speaker 1:25%? Why so specific?
Speaker 2:They actually say why in the documents. It's to cover potential delays in government payments, maybe invoice errors, maybe even government furloughs. They want you to be able to operate for about the first three months without sweating payment.
Speaker 1:So it's a cushion. Risk mitigation again.
Speaker 2:Exactly, can you weather a short storm?
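The arithmetic behind that cushion is simple enough to sketch with a made-up contract value: 25% of the base period cost works out to roughly three months of operating expenses on a 12-month base.

```python
# Hypothetical figures, just to show the scale of the liquidity requirement.
base_period_cost = 20_000_000                 # assumed 12-month base period value
required_reserve = 0.25 * base_period_cost    # accessible funds or line of credit
monthly_burn = base_period_cost / 12

print(f"Required reserve: ${required_reserve:,.0f}")                           # $5,000,000
print(f"Months of operations covered: {required_reserve / monthly_burn:.1f}")  # ~3.0
```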
Speaker 1:Makes sense. And finally, security clearances, always critical.
Speaker 2:Paramount, especially for the 412th Test Wing, where they specifically need proof of managing personnel with Secret SAP access and Top Secret/SCI access.
Speaker 1:SAP and SCI. High-level stuff.
Speaker 2:Very. For the other units, 96 CTG, 96 TW and AEDC, it's focused on Secret or above, and this clearance management capability is often the third tiebreaker, though interestingly for the 412th it's actually the fourth. So yeah, if you pull back and look at all these general capabilities together, it paints a picture of what they value most: reliable execution, managing risk effectively and ensuring operational continuity, especially in these really sensitive areas. It's about building trust.
Speaker 1:Okay, so that covers the general stuff, but then each unit has its own very specific technical needs, right? This is where the mission details come in.
Speaker 2:Oh yeah, absolutely, and the level of detail is pretty striking. You really see the unique focus of each organization.
Speaker 1:So give us some examples, like for the 412 test wing.
Speaker 2:Okay, the 412th, big focus on program management advice. Things like Theory of Constraints and Critical Chain Project Management, these are specific advanced scheduling methods, plus, obviously, flight test and air vehicle management. Right. They also need deep expertise in range and instrumentation, specifically mentioning real-time data processing and wireless instrumentation, and extensive electronic warfare experience, even down to things like anechoic chambers and open-air ranges. A lot of these need proof from contracts with at least 25 CMEs. So, sizable efforts.
Speaker 1:Okay, 25 CMEs. What about the cyber guys? The 96th Cyberspace Test Group.
Speaker 2:Ah, 96 CTG. It is all about advanced cybersecurity engineering support. We're talking really complex stuff like support for C4ISR systems, that's command, control, communications, computers, intelligence, surveillance and reconnaissance, across the whole battlespace arena and kill chain. Wow. Yeah, and that specific one requires experience on a contract with at least 125 CMEs. Huge scale. They also need experience testing offensive and defensive cyber programs, penetration testing for avionics, mission planning systems, even space situational awareness testing. The CME requirements vary, but yeah, 125 CMEs for that C4ISR one is significant.
Speaker 1:Definitely shows the scale of cyber defense. Okay, how about the 96th test wing? Are they different?
Speaker 2:96 TW is broader in some ways. Focus on test and evaluation, program management and general systems engineering. Also big on instrumentation and measurement and range and instrumentation. They need the full life cycle of flight and/or ground testing experience: planning, execution, analysis, reporting. Plus information management, modeling and simulation, general cybersecurity support, and specific experience testing air-to-ground and air-to-air munitions and different aircraft types. Most of these need proof from contracts with at least 50 CMEs.
Speaker 1:Still substantial. And finally, AEDC, the Arnold Engineering Development Complex.
Speaker 2:AEDC is where you see some really specialized, almost unique requirements. Things like direct technical and engineering support on DoD RDT&E ground testing, that's research, development, test and evaluation, covering areas like air-breathing propulsion testing and hypersonic vehicle testing, including propulsion, high-temp effects, materials, boundary layer analysis. Super complex physics.
Speaker 1:Hypersonics, yeah, that's cutting edge.
Speaker 2:Definitely. They also value experience working in multifunctional teams, military, government civilians, other contractors, and supporting specific test organizations like an LDTO or ETO. Plus testing for space environmental effects, rocket propulsion, multispectral signatures, wind tunnels across all speed regimes, even captive trajectory testing of weapons.
Speaker 1:Captive trajectory, like holding a missile under a plane in a wind tunnel?
Speaker 2:Basically, yeah, testing its aerodynamics before it's actually fired. What's interesting with AEDC is that some technical subfactors have 'none' listed for the source type restriction, meaning for those specific, perhaps very niche, skills they weren't initially limiting it to only large contracts, maybe broadening the search for rare expertise, although still federal contracts.
Speaker 1:Interesting. So, across all these units, what else stands out technically?
Speaker 2:Well, one thing that pops up everywhere is the Defense Security Service (DSS) Vulnerability Assessment Rating. It's a direct measure of how well you handle classified stuff.
Speaker 1:Makes sense.
Speaker 2:And those CPARs we mentioned earlier, covering quality, schedule, cost control, management and compliance, they are absolutely foundational, so much so that they're the second tiebreaker across the board. Past performance is clearly critical.
Speaker 1:Okay, so factor one is done, technical scores are validated. Now, factor two: cost and price. This isn't just lowest bidder wins, right? You mentioned traps.
Speaker 2:Definitely not lowest bidder and, yes, potential traps if you're not careful. They have this rigorous two-step process to see if your price is both realistic and reasonable.
Speaker 1:OK, step one?
Speaker 2:Step one, they analyze your individual labor rates, direct and indirect, against government benchmarks. They calculate what they call a most probable cost, or MPC, using data from the Bureau of Labor Statistics, DCAA audits, things like that. So they essentially have their own number in mind. And if your proposed rates are more than 5% below their MPC or more than 10% above it, they can adjust your price up or down for evaluation. That flags a potential issue right there. 5% under, 10% over. Tight margins.
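To picture that band, here is a tiny sketch against a hypothetical $100 benchmark rate. The 5%-below and 10%-above triggers come from the discussion; everything else is illustrative.

```python
# Step-one style band check against the government's most probable cost (MPC) benchmark.
def check_rate(proposed_rate, mpc_rate):
    if proposed_rate < 0.95 * mpc_rate:
        return "flag: more than 5% below MPC (realism concern, may be adjusted)"
    if proposed_rate > 1.10 * mpc_rate:
        return "flag: more than 10% above MPC (reasonableness concern)"
    return "within band"

print(check_rate(88.00, 100.00))   # flag: more than 5% below MPC
print(check_rate(105.00, 100.00))  # within band
print(check_rate(112.00, 100.00))  # flag: more than 10% above MPC
```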
Speaker 1:What's step two?
Speaker 2:Step two gets even trickier. For each labor category needed, they look at your entire proposed team, you and all your subs, and find the single highest fully burdened labor rate proposed by anyone on the team for that category.
Speaker 1:The highest rate on the whole team.
Speaker 2:Yep and they use only that highest rate to calculate the total cost for everyone in that category. They just ignore any lower rates proposed by you or other subs for the same job.
Speaker 1:Wow. So if you have one really expensive specialist sub, their rate gets applied to everyone in that category for the cost calc.
Speaker 2:For the government's evaluation cost calculation, yes. And if the resulting MPC is significantly different from what you proposed, it signals to them you might not understand the work or the labor market. Red flag.
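Here is a small illustration of that step-two mechanic, with invented team members, rates, and hours: every hour in a labor category gets evaluated at the highest fully burdened rate anyone on the team proposed for it, which is how the evaluated cost can drift well above what was actually bid.

```python
# Hypothetical proposal: category -> list of (team_member, fully_burdened_rate, hours).
proposed = {
    "Systems Engineer": [("Prime", 110.0, 8000), ("Sub A", 140.0, 2000)],
    "Test Analyst":     [("Prime", 95.0, 4000),  ("Sub B", 98.0, 1000)],
}

evaluated_cost = 0.0
for category, bids in proposed.items():
    highest_rate = max(rate for _, rate, _ in bids)   # highest rate on the whole team
    total_hours = sum(hours for _, _, hours in bids)  # applied to every hour in the category
    evaluated_cost += highest_rate * total_hours

proposed_cost = sum(rate * hours for bids in proposed.values() for _, rate, hours in bids)
print(f"Proposed cost:  ${proposed_cost:,.0f}")   # $1,638,000 as actually priced
print(f"Evaluated cost: ${evaluated_cost:,.0f}")  # $1,890,000 at each category's top rate
```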
Speaker 1:So bottom line, if their calculations result in an upward adjustment of 5% or more to your proposed cost, or if your price is just deemed too high, maybe 10% over their MPC.
Speaker 2:It could very well be deemed unrealistic or unreasonable and your whole proposal gets rejected, even if your technical score was amazing.
Speaker 1:That is a very tight window. And there are hard caps too, right?
Speaker 2:Absolutely unforgiving hard caps. Transition costs cannot exceed $50,000, period, and your overall fee rate cannot exceed 5.00%. Go over either, even by a penny, and you're ineligible.
Speaker 1:Out, regardless of anything else.
Speaker 2:Regardless. And you know, you could be within those caps, but if your pricing strategy makes your overall evaluated price seem unreasonable compared to their MPC, you could still get rejected. And they also warn about unbalanced pricing, right? That's where you might, say, lowball one area and make up for it by overpricing another. They see that as an unacceptable risk and can reject you for that too. You need a consistent, realistic strategy.
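Pulling the cost-side tripwires together, here is a minimal sketch using the $50,000 transition cap, the 5.00% fee cap, and the 5%-or-more upward adjustment trigger from the discussion; the input figures are hypothetical.

```python
# Thresholds from the discussion; values passed in are made up for illustration.
def eligibility_check(transition_cost, fee_rate, proposed_cost, evaluated_mpc):
    if transition_cost > 50_000:
        return "ineligible: transition cost exceeds $50,000"
    if fee_rate > 0.05:
        return "ineligible: fee rate exceeds 5.00%"
    if evaluated_mpc >= 1.05 * proposed_cost:
        return "at risk: upward adjustment of 5% or more suggests unrealistic pricing"
    return "within caps and adjustment band"

print(eligibility_check(50_001, 0.049, 18_000_000, 18_200_000))  # transition cap breached
print(eligibility_check(45_000, 0.050, 18_000_000, 19_500_000))  # >=5% upward adjustment
print(eligibility_check(45_000, 0.050, 18_000_000, 18_200_000))  # within caps and adjustment band
```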
Speaker 1:OK, let's circle back quickly to those tiebreakers. You said they're like a secret decoder ring.
Speaker 2:They really are. They show you exactly what the Air Force prioritizes when it comes down to the wire between two equally scored proposals. The order matters.
Speaker 1:So lay it out for us. Number one?
Speaker 2:Number one, universally: highest number of CMEs successfully transitioned from the incumbent within 90 days.
Speaker 1:Smooth handoffs are clearly priority one. Okay, number two.
Speaker 2:Number two, also universal: highest total score on your combined CPAR reports. Consistent, high-quality past performance across the board.
Speaker 1:Makes sense. Number three you said this varies.
Speaker 2:It does slightly. For the 412th TW, it's the highest percentage of positions filled quickly within 30 days of vacancy. For the other three units 96 CTG, 96 TW and AEDC it's the highest number of personnel with secret or higher clearances.
Speaker 1:Interesting distinction Speed versus cleared personnel count.
Speaker 2:Reflects maybe slightly different operational pressures or needs.
Speaker 1:Okay, and number four.
Speaker 2:Number four is about those geographically separated locations again: highest number of separate locations you've performed work at. But remember, those different CME minimums apply here too. One CME minimum for 412th and AEDC ties, 10 for 96 CTG, 4 for 96 TW.
Speaker 1:Wow, even the tie breakers are tailored.
Speaker 2:It really underscores how specific their needs are, unit by unit.
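If it helps to see the tiebreaker cascade as an ordering rule, here is a toy example for a 96 CTG-style competition (for the 412th, the third key would be fill speed rather than cleared personnel); the proposal data is invented.

```python
# Hypothetical proposals with identical technical scores; ties break in the stated order.
proposals = [
    {"name": "Offeror A", "score": 95, "cmes_transitioned": 120, "cpar_total": 40,
     "cleared_personnel": 300, "qualifying_locations": 3},
    {"name": "Offeror B", "score": 95, "cmes_transitioned": 120, "cpar_total": 42,
     "cleared_personnel": 280, "qualifying_locations": 5},
]

ranked = sorted(
    proposals,
    key=lambda p: (p["score"], p["cmes_transitioned"], p["cpar_total"],
                   p["cleared_personnel"], p["qualifying_locations"]),
    reverse=True,
)
print([p["name"] for p in ranked])  # ['Offeror B', 'Offeror A']: the CPAR total breaks the tie
```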
Speaker 1:Okay, and looking ahead just a bit, that TMAS 3 industry day Q&A. Any juicy bits?
Speaker 2:A few interesting confirmations and hints. The overall structure is expected to be very similar to TMAS 2, so everything we've discussed is likely still relevant. Good to know. They clarified teaming: yes, you can prime one task order and sub on another, but no cross-teaming on the same task order, which keeps teams distinct. They also explicitly said there are no critical subs the government would mandate, so offerors have to build their own best team. And for CPARs, it's strictly the last five years of performance that counts, even on longer contracts, and N/A ratings are just ignored, not penalized.
Speaker 1:Right, and you mentioned something about NASA.
Speaker 2:Yes, that was interesting. Specifically for AEDC, there was openness expressed to potentially considering non-DoD federal contracts, like NASA experience, for some of the highly technical subfactors.
Speaker 1:Really that could broaden the pool.
Speaker 2:It could. It suggests they might be willing to look at comparable complex engineering environments outside pure DoD work if the offeror makes a really strong case for relevance. It hints at potential future flexibility.
Speaker 1:That is a powerful insight. Wow, I think we have covered a lot of ground here. We really have. From work samples through cost realism gymnastics to those subtle differences in tiebreakers, you can really appreciate the immense amount of strategic thinking that goes into picking partners for these vital services.
Speaker 2:Absolutely. It's a very rigorous, multi-layered process. It's designed to find contractors who don't just have the technical chops, but also the operational maturity, the financial stability, the integrity, really, to deliver when the stakes are incredibly high.
Speaker 1:Yeah, understanding these criteria, it's like a masterclass in how government agencies define and assess value, isn't it?
Speaker 2:It really is.
Speaker 1:What jumps out at me is how much they stress not just what you did, but how well you did it, and your proven ability to handle the specific tough challenges of a new job. So for you listening, hopefully this deep dive gives you a unique window into this meticulous world, showing that success really hangs on proving you understand the mission and the market inside and out.
Speaker 2:Well said, and you know it brings up a big question for the industry, doesn't it? In this kind of environment super high technical bars, but also incredibly tight scrutiny on cost realism how do you keep innovating, how do you stay agile and competitive while also staying compliant? It's a constant tightrope walk.
Speaker 1:It really is a constant balancing act. So here's a provocative thought, maybe, to leave you with. If you were putting together a proposal knowing all of this, how would you strategically shape it to truly stand out as that ideal, highest technically rated offer with a realistic and reasonable price?
Speaker 2:Definitely something to think about.
Speaker 1:That's all for this deep dive. We hope you found these insights, well, insightful. Until next time, keep digging for knowledge.