I know nothing about this but I'm hoping that I pieced the story together correctly:
From 1999-2001 there was a hoax on the internet - a person named Kaycee - who supposedly had leukemia and died. [1] She was primarily run by a 40-year-old mom in Kansas. The puppeteers gave Kaycee such an elaborate persona that many people truly believed she was real. It's heartbreaking seeing some of the things people wrote.
> I am one of those folks who was emotionally involved in Kaycee's life. I believe ... that she was a real person. ... Kaycee was here. Now she is not. And I miss her very much. [2]
This post is supposed to be a joke: someone playing along with the story, claiming that Kaycee actually wasn't real... but because she was actually an AI developed in an MIT lab that was so dangerous that the researchers shut it down... [3]
The funny thing is this is actually plausible now.
Earlier today I found an entire gang of YouTube channels that pretend to be run by real-life attractive women. The channels link to Instagram pages that try to look convincingly real and authentic. They comment on each other's videos to convince viewers that they're real. People aren't making fake AI people because it's funny, because they're bored, or because someone wants to destroy the world. They're here now to make easy ad revenue.
Earlier today someone wrote that, since 2002, there have only been 500 people on the Internet...
I was reminded of this late April Fools' joke from MetaFilter in (it turns out) 2001, where the very popular site simply displayed this message, implying that everything was artificially generated text except for a few unwitting human participants... including, presumably, the reader of the message.
It could possibly have to do with his sister's allegations. It's one of the top autocomplete results when you google "sam altman", so people are definitely talking about it.
I actually took a more empirical approach to this problem recently.
https://kylediaz.com/post/scraping-emails-hackernews/#llm-sc...
I had similar results.
Email obfuscation like name [at] domain [dot] com is trivially found by both regex and LLMs, but emails like name@[my domain] or name(delete me)@domain.com are harder for LLMs. LLMs could find those emails, but wouldn't de-obfuscate them no matter how I prompted. It's probably a skill issue on my part, and I'll have to try that neat "think step-by-step" trick.
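To illustrate how trivial the [at]/[dot] style is for regex (this is my own minimal sketch, not the code from the linked post):

```python
import re

def deobfuscate(text: str) -> str:
    """Rewrite common [at]/(at) and [dot]/(dot) obfuscation back
    into a plain email address."""
    text = re.sub(r"\s*[\[(]\s*at\s*[\])]\s*", "@", text, flags=re.IGNORECASE)
    text = re.sub(r"\s*[\[(]\s*dot\s*[\])]\s*", ".", text, flags=re.IGNORECASE)
    return text

print(deobfuscate("name [at] domain [dot] com"))  # name@domain.com
```

Two substitutions are enough; the hard cases are the ones that need a human-style inference (delete this part, substitute my domain), which is exactly where a plain regex stops working.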
IMO, the best and easiest way to obfuscate is using invisible HTML elements:
name<span style="display: none">you can't see this</span>@domain.com
It's technically scrapable (as I show in my post), but it gives way more "security" than [at]/[dot] while still allowing users to just copy/paste it.
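A minimal stdlib sketch of why it's still scrapable: a parser that drops any text inside a display:none element recovers the address. (This is an assumption-laden toy: it only checks inline styles and ignores CSS files and void tags, which a real scraper would have to handle.)

```python
from html.parser import HTMLParser

class VisibleTextExtractor(HTMLParser):
    """Collect only text that is not inside a display:none element."""
    def __init__(self):
        super().__init__()
        self.hidden_depth = 0  # nesting depth of hidden elements
        self.parts = []

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "") or ""
        # Once hidden, count every nested tag so end tags stay balanced.
        if self.hidden_depth or "display:none" in style.replace(" ", ""):
            self.hidden_depth += 1

    def handle_endtag(self, tag):
        if self.hidden_depth:
            self.hidden_depth -= 1

    def handle_data(self, data):
        if not self.hidden_depth:
            self.parts.append(data)

p = VisibleTextExtractor()
p.feed('name<span style="display: none">you can\'t see this</span>@domain.com')
print("".join(p.parts))  # name@domain.com
```

So it's only security against lazy scrapers, but that's most of them, and unlike [at]/[dot] the visible text copies and pastes cleanly.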
What made you add support for email subdomain addressing? Is it a feature a lot of customers request? I recently incorporated it into my workflow and love it, but I honestly can't recall ever seeing anyone else use it.
I knew that plus addressing was inherently flawed in that it required sites to accept the plus "+" character. So I implemented subdomain addressing to ensure wide compatibility. And it looks nicer.
[1] https://en.wikipedia.org/wiki/Kaycee_Nicole
[2] https://www.metafilter.com/7819/Is-it-possible-that-Kaycee-d...
[3] https://metatalk.metafilter.com/608/Plastic-gets-the-story-a...