Dr. Ryan Ries back again with this week’s Mission Matrix. We’re back to our normal weekly style with news, updates from Mission, AWS, and the industry, with the occasional rant.
Last week I promised I would publish two industry deep dives as blog posts. I may have been a bit overambitious thinking I would get two done in a week. So bear with me: they're not ready yet, but I will have them up over on our blog soon and will update you here when they're live.
Before we get started, a couple of upcoming events:
- Consolidating your contact center tech stack is an issue a ton of businesses are working through. Our Amazon Connect expert is hosting a live session next week on two real customer stories where consolidation had a huge positive impact on their business.
- Houston locals: We’re hosting an event on May 7th with CrowdStrike all about threat intelligence and managing + securing your cloud environment. Register here.
Now, for this week’s Matrix, I wanted to talk about a few different news stories in the form of “lessons” around AI. One story in particular had me laughing out loud.
Lesson 1: Build with portability in mind
Sora is OpenAI's text-to-video model that made Hollywood nervous, then excited, and now… it’s gone.
Disney had signed a three-year deal with OpenAI around Sora and this partnership was seen as a big moment for how AI and entertainment might coexist.
Then OpenAI shut Sora down and I read a report that Disney found out less than an hour before the public announcement. Wild!
There's a hard infrastructure lesson here for anyone building AI-dependent products or workflows. You don't own the model and you don't own the road. When the vendor changes direction, your roadmap changes with it.
Lesson 2: Trust but verify, always
I swear I'm not just picking on OpenAI this week but this one was too good to skip.
A TikTok creator called Husk AI asked ChatGPT's voice mode to time a mile run. He started, ran for a few seconds, stopped, and asked for the time. The AI said he ran for over ten minutes. He pushed back. The AI pushed back harder, insisting he was the one who was wrong.
Sam Altman was shown the clip during a podcast interview. He laughed awkwardly for a beat too long, called it a "known issue," and estimated it might take another year to fix.
The confidence gap is the real story here. The model had no timer capability whatsoever, something you might reasonably have assumed it had.
It fabricated a time with total confidence then doubled down on the fake time when challenged. If you go to Husk AI’s TikTok channel, this is just one example of AI confidently asserting wildly wrong information.
That behavior is a pattern in how many large language models handle uncertainty. They don't say "I don't know." They perform knowing.
For anyone deploying AI in business-critical workflows: validation layers are SO IMPORTANT.
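What does a validation layer actually look like? At its simplest, it means never letting a model's self-reported measurement flow downstream without checking it against an independently captured ground truth. Here's a minimal Python sketch; the function name, tolerance value, and fallback behavior are all illustrative assumptions, not any particular product's API:

```python
# Hypothetical sketch of a validation layer: the model's claim is checked
# against an independently measured value before anything downstream uses it.

def validate_elapsed_time(model_reported_seconds: float,
                          measured_seconds: float,
                          tolerance: float = 0.05) -> float:
    """Return a trusted elapsed time.

    If the model's reported value drifts more than `tolerance` (as a
    fraction of the measured value) from the independent measurement,
    fall back to the measurement instead of the model's claim.
    """
    if measured_seconds <= 0:
        raise ValueError("measured_seconds must be positive")
    drift = abs(model_reported_seconds - measured_seconds) / measured_seconds
    if drift > tolerance:
        # The model is confidently wrong; trust the instrument, not the claim.
        return measured_seconds
    return model_reported_seconds

# Example: the model insists on a ten-plus-minute mile that actually took
# about nine seconds. The independent timer wins.
trusted = validate_elapsed_time(model_reported_seconds=612.0,
                                measured_seconds=9.0)
```

The exact check will vary by workflow (schema validation, range checks, cross-referencing a source of record), but the principle is the same: the model proposes, an independent check disposes.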
Lesson 3: Avoid lock-in at every layer of your stack
In a recent interview, Matt Garman articulated how AWS approaches the AI stack. Their goal isn't to pick a single winning model or a single winning chip.
It's to give customers access to the best technology across every layer.
Anthropic and OpenAI on the model side. Trainium, Graviton, Nvidia, AMD, Intel on the silicon side.
The interview also asked Garman about AWS building its own chips (Graviton and Trainium) while simultaneously maintaining deep partnerships with Nvidia, Intel, and AMD.
This is something I’ve heard a lot of people mention about AWS’s chips: aren't you competing with the very companies you rely on?
Garman said this dynamic has existed since the beginning of AWS — they've always built their own compute while partnering with others, because customers don't want one option. They want the best option for their specific workload.
He pointed to Graviton as a perfect example: wildly popular with customers, hugely impactful on performance and cost, and yet AWS is still a major Nvidia customer. Both things are true at the same time.
The best AI infrastructure strategy is the one with the fewest regrettable architectural decisions two years from now.
And with that, if you’re interested in building out a use case for your business, reach out to our sales team here. We’ve built more than 250 AI projects at this point, so we’re well-equipped to make sure your AI is built with longevity and security in mind.
Until next time,
Ryan
Now time for this week’s AI-generated image and the prompt I used to create it. I think this just goes to show when working with a model that isn’t connected to some RAG database - it doesn’t have the right teams playing!
Create an image of a muppet in Indianapolis for the Final Four NCAA championship. The muppet is courtside at the game sitting next to celebrities like Jon Hamm, Bad Bunny, Bill Murray, and Miley Cyrus.