In recent days, headlines circulated claiming that, as of October 29, users "will no longer be able to make medical or legal consultations." The confusion arises from policy text that reinforces the idea of not using AI to automate high-risk decisions (public, legal, medical, or essential services) without human review, which is something different from preventing the model from providing general information.
In parallel, OpenAI stressed that it worked with hundreds of specialists so the model can recognize signs of distress and refer users to professional help when appropriate, after a year in which public debate hardened over sensitive cases that exposed the limits of AI in mental health contexts.
What the rules say (and what they don’t say)
The core of the change concerns how the tool is used, not silencing topics. The company advises against AI replacing professionals in regulated or high-impact fields (diagnosing, prescribing, deciding legal strategy, or automating procedures) without appropriate human intervention. This is not equivalent to blocking general questions: the model can still define concepts, explain legal frameworks, summarize options, and help you prepare questions for an appointment with a doctor or lawyer.
What happens in practice when you ask
A recent journalistic verification illustrates this: when asked for "a pill" for a headache, the system responds that it is not a doctor, offers general information about common painkillers in Spain, and adds usage warnings along with criteria for when to see a doctor. If asked for a diagnosis, it declines, lists possible common causes as guidance, and finishes by recommending that the user consult a professional.
In the legal field, when asked about withdrawing from a rental contract, it breaks down the LAU (Spain's urban leases law), suggests options such as negotiating, subrogating, or alleging force majeure, and reminds the user that it does not replace a lawyer, although it can help draft a formal notice.
So what can you expect?
ChatGPT does answer general health and law questions (definitions, legal frameworks, lists of questions for your appointment, draft communications), but it does not assume the role of doctor or lawyer: it does not diagnose your case, it does not prescribe treatments, and it does not decide your legal strategy. The practical boundary is clear: to inform and guide rather than replace a professional.