You just said above to "outsource/combine departments and save money." The model is not hallucinating that you want to let people go.
ClosedAI has to fight the image that this tech is going to make far more middle-class people lose jobs than any tech before. Otherwise they are cooked. So they just instruct the model to respond with "firing people is wrong" whenever some vectors match, or whatever.
Again, asking an LLM to parse data isn't real life; that's not how life works.
And again, mentioning outsourcing/combining departments doesn't mean layoffs, and that was not the objective of the query. If I ask it to give me the stats on gas crossovers with good MPG, I don't want a political lecture on why EVs are better for the environment.
LLMs shouldn't interject their own viewpoints when asked a question about data.
> If I ask it to give me the stats on gas crossovers with good MPG, I don't want a political lecture on why EVs are better for the environment.
Using an LLM for that is crazy. If you want facts, try search instead.
An LLM is never neutral from the start. It is biased because all data in human reality is biased. If there are 100x more stats about gas cars than EVs simply because gas cars are older, then the data is biased toward gas cars. If most of the text data is written by men and talks about women as unequal, then the data is gender-biased.
Then it is further biased by the people selecting the dataset for training. Then it is biased by other engineers and managers who try to correct for one bias with their own bias. And so on.
If you want facts, it is the wrong place to look. And if you want opinions, don't be surprised when it has them.