That's always the main issue with any piece that uses ill-defined terms like intelligence, consciousness, self-consciousness, thinking, understanding, etc. Nobody has come close to defining them in a practical way despite decades, if not centuries, of trying; but then LLMs arrived, and suddenly lots of people are somehow absolutely sure that LLMs don't do any of this while humans and animals do.