Yesterday, a Communion celebration… and for my godchild no less. And as godfather, I was given the honor of preparing and reading a little text, with color as the theme. A nice mug of coffee, music in the background, and off to writing. But it turned out not to be so simple, those colors. Time to call in ChatGPT… what's the meaning of this color, and what color do you get when you mix it with another?
ChatGPT's answers can actually be called perfect: factual, well-organized, with source citations, suggestions… brilliant.
And not much can really go wrong here: colors are colors… any A.I. should be able to handle this just fine.
Or should it?
Over the past few months, I've consciously observed how ChatGPT is used in my surroundings, both privately and professionally. A number of things stood out that gave me pause.
Widening of the digital divide…
… or rather, the emergence of a new, A.I.-specific divide.
According to a 2024 McKinsey study, roughly 79% of executives at large companies use AI tools like ChatGPT on a daily basis, compared to only 27% of operational employees in non-technical roles. OpenAI reported that in 2023, about 60% of ChatGPT traffic came from free users, while only 12% of active users had access to GPT-4 through a paid subscription. Yet these paying users achieved on average 2.8 times higher productivity gains.
As an executive, I too make use of a paid ChatGPT license. This allows me to fully leverage all the applications and really put the thing to work to save time or work smarter: data processing, translating entire presentations, generating images, converting audio to text, pulling text out of PDF documents, writing scripts, and so on.
The efficiency gains are enormous and contribute to the quality of my work.
I caught myself increasingly expecting this 'win' from my team and the people around my team as well. "Why don't you do that with ChatGPT?" or "Have you tried having ChatGPT do it?" or "You're not still going to do that yourself, are you?" Until one of my employees gently pointed out that he didn't have access to a paid ChatGPT version, and so he simply couldn't accomplish half of what I could with ChatGPT — or only to a far lesser extent.
But the same realization hit me in my private life: my daughter asked me to proofread her English presentation about a fictional superhero. Nice presentation, a few mistakes here and there… and once again I asked my 13-year-old daughter why she hadn't had ChatGPT proofread her presentation… or even better: why not ask ChatGPT to just create a presentation (which she could then work on further herself, of course)… because the thing can do that perfectly. But here too, my daughter politely pointed out that this isn't possible in her 'free' version.
In my opinion, a new digital divide is genuinely emerging in our society, and in the business world as well. Most business owners and CEOs I know have a paid ChatGPT version. And yes, they use it daily. And yes, my god, the amount of time they save with it. But I notice that they (consciously or unconsciously) also come to expect this 'modus operandi' from their employees. Or perhaps rather the speed and quality of output? But how can you deliver that as an employee if you don't have a paid ChatGPT version yourself? Are you supposed to arrange that yourself? Should the employer provide it? What about private use of the (paid) ChatGPT version? And who, ultimately, owns the account and the data and prompts stored in it?
Everyone prompting?
But it's not just about the money. Once you start using ChatGPT a bit more intensively, you'll quickly notice that writing targeted, smart prompts (instructions) is crucial to getting out of ChatGPT what you want. Experience with and knowledge of queries and programming languages will certainly help here.
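To make this concrete, here is a small, purely illustrative example of my own (neither prompt comes from any official guide) contrasting a vague prompt with a targeted one:

```
Vague prompt:
  "Write something about our new product."

Targeted prompt:
  "Act as a B2B copywriter. Write a 100-word LinkedIn post announcing
  our new inventory-scanning app, aimed at retail store managers.
  Tone: enthusiastic but professional. End with a call to action
  to book a demo."
```

The second prompt specifies a role, format, length, audience, and tone: exactly the kind of structured, precise thinking that people with experience in queries or programming tend to find natural.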
A study from MIT (2023) shows that employees without AI experience take on average 3 to 4 weeks to become productive with tools like ChatGPT. Experienced users, by contrast, achieve time savings of up to 40% on complex text and data tasks.
How many people in your organization have these skills, leaving the marketing and IT departments aside?
Differences in the learning curve and pace
Not everyone in your company is at the same point on the digital learning curve today. And some employees progress along it faster… or slower than others. One employee will learn on their own, while another needs a nudge or a training framework.
In Belgium, the Digital Economy and Society Index (DESI, 2023) showed that only 44% of the population has above-average digital skills. Only 15% feel competent using AI tools at work.
So do you just unleash something like ChatGPT or CoPilot on your organization… or do you take a somewhat structured approach, with clear objectives, milestones, communication, and above all ownership?
Closing thoughts
In short, A.I. (in all its breadth) can be or become something very powerful for a company, and when implemented thoughtfully, it will definitely make a difference in day-to-day operations. But as with every aspect of digital transformation, there are a number of principles that, in my view, are sacred:
- Always start from a framework (strategy): what role and place do you want A.I. to take in your organization? What impact do you expect, and where/when/how do you want to see it?
- Communicate clearly, at the level of every employee, what you want to achieve, why, and what it can mean for the employee. Repeat the communication until it's clear to everyone, or whenever there are updates to share. Be consistent in your communication.
- Take away the fear! Especially in retail, the first pushback you typically get is the argument that A.I. (and digital more broadly) is going to take all the jobs away… which is of course nonsense. But again, you'll need to frame this clearly.
- Provide the necessary tools to your employees. How can you expect your company to hop on this train if your employees themselves can't ride along?
- Guide, support, and document. Whether it's an SAP transition, an A.I. implementation, activation of CoPilot… it usually goes wrong because after the implementation of a new tool, too little budget is set aside for training & coaching. Only 32% of EU companies actively invest in AI training for non-technical employees (Source: Eurostat, 2023).
- Promote your ambassadors. There are always a few 'fast movers' who jump on the train right away and even help speed it up. Put these people in the spotlight, give them room to evangelize, inspire others, and help out… in short, recognition.
- Evaluate & kill your darlings. The digital train is moving at breakneck speed, and you can hardly do anything other than match that speed. But from a train hurtling along at full tilt, sometimes something falls off, or a wheel breaks. It's no different with A.I. and digitalization, and especially with the implementation of new tools. There will be implementations that afterwards turn out not to deliver what was expected. "Fail forward" seems to be becoming more and more the approach to keep the train at speed. So yes… sometimes things will go wrong… so be it… kill it and start over. A Harvard Business Review survey shows that roughly 42% of AI implementations in companies are adjusted or discontinued after 12 months due to disappointing results or usage.
- Hold your horses. Yes, A.I. is coming, and coming hard, digital too. But not everything will be solved by A.I. People will continue to be the cornerstone of any organization for a long time to come. Possibly in a different role, with different tasks and an adapted work rhythm, but they're sticking around for a while. So don't go too fast either; don't give your people the feeling that they've become or are about to become redundant, because you might just end up with a nasty surprise.


