Hacker News

Reliable information on this does not exist on vendor sites, though. It exists on Reddit, in books, in med/physio papers, and in a bunch of other places a SOTA model has read in training or can (for now) access via web search.

LLMs are already very good for shopping, but only as long as they sit on the outside.



Idk, I earnestly tried using LLMs to find me the smallest-by-volume regular ATX PC case 3 months ago and it was a nightmare. That info is out there, but the model could not avoid mixing in ITX and micro-ATX cases (sometimes because Reddit posters mislabeled them) and just missed a bunch of cases outright. And since it let mistakes slip in, I had to double-check every volume calculation it did.

I found the Jonsbo D41 without the help of an LLM, despite trying. (There might be a few smaller cases, but they are 3x the price.)

LLMs don’t weigh and survey the options well. They surface a few texts, Reddit threads in this case, that mention some subset of cases, and that text heavily shapes the answer. That is not what you want a commerce agent to do; you don’t want text prediction. I doubt it turns up the obscure but optimal option in most cases.
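The volume double-checking above is trivial to do yourself. A minimal sketch, assuming made-up placeholder dimensions (these are not real specs for any case, including the Jonsbo D41):

```python
# Double-checking case-volume claims by hand instead of trusting the LLM.
# All dimensions below are hypothetical placeholders, not real case specs.

def volume_liters(w_mm: float, h_mm: float, d_mm: float) -> float:
    """External volume in liters from millimeter dimensions."""
    return w_mm * h_mm * d_mm / 1_000_000

cases = {
    "Case A (hypothetical)": (220, 440, 450),
    "Case B (hypothetical)": (205, 400, 460),
}

# Rank candidates smallest-first by computed volume.
for name, dims in sorted(cases.items(), key=lambda kv: volume_liters(*kv[1])):
    print(f"{name}: {volume_liters(*dims):.1f} L")
```

The point is that the arithmetic is the easy part; the hard part, which the LLM fumbled, is assembling a complete and correctly-labeled list of ATX cases to feed it.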


We are talking about a hypothetical sales chatbot which would be built alongside the business, so they absolutely have the capacity and information necessary to train the chatbot to advise their own clients.


> they absolutely have the capacity and information necessary to train the chatbot to advise their own clients.

That doesn't follow. In fact, having this capacity and information creates a moral dilemma, as giving customers objectively correct advice is, especially in highly competitive markets, bad for business. Ignorance is bliss for businesses, because it lets them bullshit people through marketing with less guilt, and if there's one thing any business knows, it's that marketing has better ROI than product/service quality anyway.


My brother in christ, the function is just that of a salesman; you are pondering philosophically about whether a salesman's actions are ethical.

Company makes a chatbot that sells their product and advises customers on what to buy.

To be fair, you are not asking the wrong questions and you are on to something; it's just basic territory if you are into sales. Read up on the subject. A source I would recommend, although obscure, is Claude Whitacre. I believe in "Sales Prospecting" he talks about this specific dilemma of giving good advice vs. selling your product. He argues that a good salesman will give good advice over pushing their own product, and that this is beneficial because it builds trust in the salesman and might result in more sales down the line, even if that specific interaction didn't end in a sale.



