AI round-up: Week of November 18, 2024

Next week is Thanksgiving. Does anyone remember what happened around this time last year?

Sam Altman was removed and then reinstated as CEO of OpenAI. (Removed November 17, reinstated by November 22, to be exact.)

Does anyone remember what happened two years ago around this time? ChatGPT was launched. (November 30, to be exact.)

What will happen this year? Who knows? But what I do know is whatever it is, we won’t cover it until the week of December 2nd because we’re off next week! (Hope you are able to find some time to ease into and enjoy the holiday!)

Don’t worry. Like turkey leftovers, we have plenty of news this week to last us a while.

the BIG five

Microsoft’s Copilot: From Hype to Hurdles

When Microsoft launched Copilot last year, we were promised a tech revolution. Remember that first video it put out? Where do I sign up??

Fast-forward a year, and the reality is now more complicated and convoluted. From sky-high investments to customer frustrations and rivals circling like hawks, this Business Insider exclusive dives into the growing pains of Copilot and what it means for the future of AI-driven productivity tools. It’s a longer read, but it’s worth it.

“Operator, oh, could you help me place this call.”

Ah damn. No one did it like Croce.

So why am I putting a Jim Croce song in your head this morning? Well, OpenAI is getting ready to introduce … wait for it … no, not GPT-5 … but Operator.

Operator will allow ChatGPT to take control of tasks on your behalf – booking flights, for example.

Now, I’m not going to spend time talking about the different types of agents – specifically where Operator fits on that spectrum – because Paul Roetzer does a great job of breaking it down. We’ll get to that in the ‘Learn a little’ section.

What Sets AI-Driven Companies Apart

A new piece from HBR dives into what makes the top 10% of AI-driven companies different from the rest. Spoiler: It’s not just about the tech—it’s about embedding AI into strategy, breaking down silos, and focusing on experience over efficiency. The article highlights practical ways companies are using AI to transform both customer and employee outcomes, proving that people, not just algorithms, are key to scaling AI successfully. Where do you see yourself and your company falling in that conversation?

Who owns you? Well, your digital ghost anyway.

Christopher Penn does it again.

He tackles a question that I’ve been asked a lot lately: Who owns your digital twin? Can a company retain your knowledge and build/train a ‘you’ to do the work you’re doing – or once did?

Does anyone else think labor law may change a bit over the next few years?

This article is worth your time … if you want to know who ‘you’ll’ be working for in the future.

Coca-Cola

I had to include this here, not because of the ad itself (which you can see here) but because of the backlash and reaction.

Um, where ya been??! This isn’t Coca-Cola’s first AI venture, and they haven’t hidden the fact that they’re going to use AI in their ads. (Which we’ve also talked about here.)

We are going to see more and more of this as people start to see the output of all the things we’ve been talking about for nearly two years. It’s why I never understood the lack of attention or interest in AI storylines. Companies weren’t investing in this for the heck of it – they wanted to use it. And that moment some people felt would never arrive – or not for a long time at least – is here.

And now we have to deal with it. (Once we’re done shaking our fists in the air.)

Learn a little

We’ve heard a lot about agents these past few weeks. So, of course, it’s clear what they are and what they mean for the future of work, right?

Right.

Thankfully, Paul Roetzer broke it down in a post and in the newest episode of The AI Show.

From Paul (LinkedIn)

AI agents are seemingly everywhere all of a sudden. Anthropic, Google, HubSpot, Microsoft, Nvidia, OpenAI, Salesforce, ServiceNow, Writer and many other leading tech companies have centered their product roadmaps and marketing campaigns on AI agents (or agentic AI).

But what, exactly, are AI agents?

The simple definition is “an AI system that can take actions to achieve a goal.” This is an evolution of the generative AI tools you’re used to.

Large language models (LLMs)—like those that power ChatGPT, Claude and Gemini—answer questions and create outputs by predicting tokens or words.

Super valuable, but it still leaves humans to do the work in multi-step processes such as making online purchases, booking flights, scheduling appointments, planning meeting agendas, sending email campaigns, completing digital forms, and creating business strategies.

The promise of AI agents is that they will autonomously perform these actions, freeing humans up from mundane and repetitive tasks.

In theory, the human gives the AI a goal, and the AI flawlessly plans and executes the actions needed to reach the desired outcome. (In an autonomous car analogy, it would be like removing the steering wheel.)

But the reality is that the AI agents we have today are largely not autonomous, despite how some technology companies may be defining and positioning them.

Humans are needed to set goals, plan and design the agents, connect the data sources and supporting applications/tools, oversee execution for accuracy, provide inputs to iterate and improve them, and analyze performance.

That doesn’t mean AI agents aren’t enormously valuable and potentially very disruptive to the future of work.

It just means not to believe the hype when you hear about autonomy in AI agents. The only thing autonomous in most cases is the execution of actions, but even that requires human oversight in the majority of applications.
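If it helps to see the idea rather than just read about it, here’s a minimal, hypothetical sketch (in Python) of the human-in-the-loop agent pattern Paul describes. None of the names here – propose_action, TOOLS, the flight examples – come from any real product or API; they’re stand-ins meant to show the shape of the loop: the model plans the next step, but a person still approves every action before it runs.

# Hypothetical sketch of a human-in-the-loop agent loop.
# The model proposes actions toward a goal; a human approves each one
# before the corresponding tool actually executes.

TOOLS = {
    "search_flights": lambda args: f"Found 3 flights matching {args}",
    "book_flight": lambda args: f"Booked flight: {args}",
}

def propose_action(goal, history):
    """Stand-in for an LLM call that plans the next step toward the goal."""
    if not history:
        return {"tool": "search_flights", "args": "home -> Austin, Dec 2"}
    if history[-1][0]["tool"] == "search_flights":
        return {"tool": "book_flight", "args": "option #2"}
    return None  # goal reached, nothing left to do

def run_agent(goal, max_steps=5):
    history = []
    for _ in range(max_steps):
        action = propose_action(goal, history)
        if action is None:
            print("Goal reached.")
            break
        # Human oversight: nothing runs until someone says yes.
        answer = input(f"Run {action['tool']}({action['args']})? [y/n] ")
        if answer.strip().lower() != "y":
            print("Stopped by the human reviewer.")
            break
        result = TOOLS[action["tool"]](action["args"])
        history.append((action, result))
        print(result)
    return history

if __name__ == "__main__":
    run_agent("Book me a flight to Austin for December 2")

The only “autonomous” part is the tool call itself; the planning, the approvals and the judgment about when to stop are still human – which is exactly Paul’s point.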

Did you hear…

…ChatGPT may be able to ‘see’ soon? (Digital Trends)

…Nvidia’s Blackwell AI chips are overheating? (PC Mag)

…“Seniors” are turning to AI for companionship. Why Nana’s best friend may be a robot. (New York Post) Ok, as someone who is very, very close to 50 … I have to take some umbrage with the fact that this article uses the word ‘seniors’ in the headline … then opens by talking about people ‘over 50.’ What? Can we retire the word ‘seniors’? When did it drop to 50+???

…Trump isn’t likely to prioritize or pursue AI regulation? And the AI industry is here for it. (Bloomberg)

…OpenAI has released a teacher’s guide on how to integrate ChatGPT into the classroom, grades K-12. (TechCrunch)

My favorite story:

No, AI Jesus isn’t actually hearing confessions: fact check (Catholic News Agency)

Must read/must discuss:

“AI won’t take your job. Someone who knows how to use AI will.”

Is it time to move on from this quote? It’s served its purpose, no?

Listen, the quote (still) isn’t wrong. Well, not entirely. It had staying power for a good year or so, but I feel we’re now into a one-word variation of that quote:

“AI won’t take your job. Someone who knows how to use AI well will.”

We’ve moved past the idea of using AI just to ‘make stuff,’ and we’re now starting to see that people simply using AI isn’t enough. It’s what they do with that knowledge that counts. The categories are still as follows:

1. Don’t use it

2. Use it

3. Use it well

People will continue to publish reports and write articles about job loss and more – and they’re not entirely wrong. In theory, anyway. But the time has come to live and work in the world the last two years have built – a world where AI skills are table stakes, and your path and success will be based on how well you use them.

The lessons, tools and examples have been all around us these last two years. And for the most part, FREE to us. At a certain point the focus has to shift away from convincing and educating the first group … and toward leveling up the other two. I know which group I want to be in.

Are you ready for what’s next?

-Ben

As a reminder, this is a round-up of the biggest stories, often hitting multiple newsletters I receive/review. The sources are many … which I’m happy to read on your behalf. Let me know if there’s one you’d like me to track or have questions about a topic you’re not seeing here.