AI skeptics aren’t alone in advising users not to fully trust model outputs; AI companies issue the same caution in their own terms of service.
Microsoft, which is courting corporate clients for Copilot, is facing criticism on social media over the assistant’s terms of use, last updated on October 24, 2025.
“Copilot is for entertainment purposes only,” the company warns. “It can make mistakes and may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.”
A Microsoft spokesperson told PCMag that they are updating what they call “legacy language.”
“As the product evolved, that language doesn’t reflect Copilot’s current use and will change with our next update,” said the spokesperson.
Tom’s Hardware reported that Microsoft isn’t alone in using such disclaimers for AI. For instance, both OpenAI and xAI advise users not to treat their output as “the truth” (xAI) or as “a sole source of truth or factual information” (OpenAI).
