- An Xbox executive suggested that laid-off employees use AI for emotional support and career guidance.
- The suggestion sparked a fierce backlash and led the executive to delete his LinkedIn post.
- Microsoft has laid off 9,000 employees in recent months while investing heavily in AI.
Microsoft has been promoting its AI ambitions for years, but one executive's take on the power of AI for recently laid-off former employees has landed with a thud.
Amid the company's largest round of layoffs in more than two years, affecting approximately 9,000 people, Matt Turnbull, executive producer at Xbox Game Studios Publishing, suggested that AI chatbots could help those affected process their grief, polish their résumés, and rebuild their confidence.
The gesture was meant to be supportive, but it left many game developers indignant.
Turnbull took his possibly well-intentioned but definitely ill-conceived and ill-timed message to LinkedIn. He shared ideas for prompts to give an AI chatbot that, he said, could help departing colleagues navigate professional uncertainty and emotional turbulence.
The reaction was swift and angry, leading him to delete the post, but you can still read it thanks to Brandon Sheffield's Bluesky post below.
Matt Turnbull, executive producer of Xbox Game Studios Publishing, in the wake of Microsoft's layoffs, suggesting on LinkedIn that perhaps the people who have been laid off should turn to AI for help. He actually thought posting this was a good idea.
– @brandon.insertcredit.com (@brandon.insertcredit.com.bsky.social) 2025-07-07T07:54:06.534Z
In his post, Turnbull urged former colleagues to lean on AI to reduce the "emotional and cognitive load" of losing a job, alongside prompt ideas for 30-day recovery plans and LinkedIn messages. Probably the most eyebrow-raising suggestion was a prompt to help reframe impostor syndrome after being laid off.
"No AI tool is a replacement for your voice or your lived experience," Turnbull wrote. "But in moments when mental energy is scarce, these tools can help you get unstuck faster, calmer, and with more clarity."
Even the most charitable reading of his post can't overlook how condescending and poorly timed the advice is. Angry game developers flooded the comments, likely prompting the post's deletion.
To put it gently, they don't agree that being laid off is an emotional puzzle best untangled with an algorithm. A human, by contrast, might understand the career and life upheaval it represents, and how that calls for human compassion, tangible support, and real help, such as an introduction to someone who can help land a new job.
Therapy with AI
This incident looks even worse in the context of Microsoft spending billions building AI infrastructure while drastically shrinking its gaming teams. Urging dismissed developers to lean on AI right after they lose their jobs is more than hypocritical; it's telling people to use the very technology that may have caused their job loss in the first place.
To be scrupulously, even excessively, fair to Turnbull, AI could help with some mental health concerns and could be useful for polishing a résumé or preparing for a job interview. Making AI part of outplacement services is not a terrible idea. It could bolster the internal training and career-transition support Microsoft already offers, on top of its recruiters, résumé workshops, and counseling. But it cannot and should not replace those human services. And having one of the people who let you go tell you to use AI to find a new job is the opposite of support. It is simply insult added to injury.
Microsoft's dual approach of laying people off while doubling down on AI infrastructure is a test of its company culture as much as its technical capacity. Will we see a new normal where layoffs come with prompt packages instead of counseling and severance? If the message is "feel free to use chatbots to cope after we fire you," expect plenty more outrageous, tone-deaf nonsense from executives.
Maybe they should ask those chatbots how to interact with human beings without angering them, since that's a lesson they haven't learned well.