Hacker News | reverse_no's comments

i would say from now on links to cnn should be automatically lite-ified. they have a plain-text version of all their articles. link to that instead.

every once in a while i read something or hear a narration and i cant help but suspect that GPT had some part in writing it. this article gives me very strong GPT vibes. i remember in 2018 i realized that i would never again be able to assume that an online account, post, comment, article was written by a human. and here we are. think about the author of this article. he would almost be forced to use GPT because it makes the writing and composition so much better. so much smoother. i dont think this is a good situation we are wading into…


this is mental gymnastics. i really think HN would be better with a reputation system so that posts like these will be checked for accuracy after the fact. accounts that accumulate tons of wrong predictions can be labeled and easily ignored. it would help with this problem where a huge wave of the same wrong opinion floods the comments and everyone else gets greyed out. why do we let wave after wave after wave happen without any accountability? like when half of HN confidently predicted twitter would crash… im not saying it was a bad guess but lets make it mean something to be wrong!


> i really think HN would be better with a reputation system so that posts like these will be checked for accuracy after the fact. accounts that accumulate tons of wrong predictions can be labeled and easily ignored. it would help with this problem where a huge wave of the same wrong opinion floods the comments and everyone else gets greyed out. why do we let wave after wave after wave happen without any accountability?

I don't understand what your point is - the accuracy in question is in a linked article submission, not an HN comment, no?

Which account would be getting labeled as such? The link submitter? I suspect that's too diversified to really be effective.


the article is an example of an opinion that often shows up in comments. it would make sense to have a rank based on the accuracy of the sentiments of your posted links as well as comments. all of this would be difficult to implement but thats beside the point.

for example, if a guy constantly posted articles that explain why EVs can never be viable, and he did that 2010 to 2020 then it would be nice to have some convenient way of knowing that this persons links are usually not very useful. and if a person posted a bunch of links about a specific therapy years before it became a breakthrough treatment then i would want his links pushed to the top…


Well, in lieu of a forum implementation automating that, you can already scrutinize a submitter's submission history via their profile.

In this particular case, it's their first and only submission, so what you're proposing would do absolutely nothing.


well, no. a reputation system could prevent poor predictors from hijacking threads, attenuate their moderation power, overall decrease the influence they have. and obviously that would take time to have an effect. im not sure how you could not understand that.

i dont understand why you bring up reading peoples comment history. thats totally unrelated


> i dont understand why you bring up reading peoples comment history. thats totally unrelated

Comment history != submission history.

Maybe I just plain don't understand what you're on about at all.


sentiments of comments and posts contributing to a rank… wow


Here's an idea: decide if you believe the article all on your own, through independent research or your gut.


i already do that. my gut cant stop thousands of idiots from greying good comments


well, given that industry people and “experts” confidently predicted that what has happened in 2023 could happen only hundreds of years from now or not at all… i would say that the “hype” has as much veracity as anything else. the simple fact is that no matter how carefully you curate your news, you will have no idea whats coming. and coming soon. most people havent wrapped their head around it yet but the plain and obvious truth is that technology, especially AI, must be slowed, paused, regulated in some way because too much change too fast is very dangerous.

please dont comment about previous eras of technology and change. nothing even comes close to comparing


wow. i remember using omegle in 2010. the whole chat roulette craze. this guy dresses up the issue in some overly dramatic and sentimental clothing… in reality this is the failure of yet another company that uses the old model. the model of the early internet where everything is free and everyone is anonymous. its just unviable and becomes less viable as the internet grows. captcha is broken… the old model is dead.

when you have a free service and broken captcha then you will be a magnet for crime, spam and you will hemorrhage money. maybe youll get advertisers if you sanitize the platform and now youve defeated the point anyway. or you can sell user data. at the end of the day people have to pay.


That's a depressing viewpoint.

The text chat version of Omegle could have easily been hosted on a single server with some kind of automated spam protection. Donations could have more than covered the costs to run it. The positive value it added to millions of lives far outweighs the negative.


depressing that people have to pay for the services and goods that they use? well i do agree that its pretty depressing that captcha is broken. AI seems ever closer to displacing humans… in the meantime we have to pay for internet services


Monthly users reaching 70 million. I doubt a basic server could handle that.


You’d be surprised what a well-optimized server can do. Moore's law hasn't stopped. 70 million is a pretty low number when modern $40 servers can easily do 10-100k requests per second.
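A rough back-of-envelope check, assuming ~100 requests per user per month and a 10x peak-to-average ratio (both numbers are invented for the sketch, not from the thread):

```python
# Back-of-envelope: average and peak request rate for 70M monthly users.
# Both the per-user request count and the peak ratio are assumptions.
monthly_users = 70_000_000
requests_per_user = 100            # assumed
seconds_per_month = 30 * 24 * 3600

avg_rps = monthly_users * requests_per_user / seconds_per_month
peak_rps = avg_rps * 10            # assumed 10x peak-to-average ratio

print(f"average: {avg_rps:,.0f} req/s, peak: {peak_rps:,.0f} req/s")
# → average: 2,701 req/s, peak: 27,006 req/s
```

Even the assumed peak sits comfortably inside the 10-100k req/s range quoted above.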


the site you are currently leaving a comment on operates on the old model. so far it's working pretty well.


Ironically it is technically VC backed


I wonder what it'll have to do to get to an IPO ;)


most significant internet growth was in the 2000s. It's not like some magical growth threshold was crossed in the 2010s. More and more of this audience was born internet-native too. It seems to be a cultural shift instead


i tried to extract assets from the ps2 version of 007 agent under fire. i thought it would be straightforward but it wasnt. i came to the conclusion that most of the game was compressed. it was so interesting to see this guy blow past where i got stuck. i made it as far as looking at the code as it was running in PCSX2… but i found their debugger hard to use. and iirc it doesnt let you simply dump memory, which should contain decompressed assets, into a file. the next thing would be to just look through the code for the decompression function but i didnt even know how i would do that. i traced out function calls, starting with main, but that tree grew lots of branches very quickly and again i just gave up. maybe i would give it another go but i dont really see the point when the next GPT will tackle the whole thing automatically in a year or two, right when im 80% done with hundreds of hours sunk into it.
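one shortcut worth mentioning: if the assets happen to be plain zlib streams (a guess on my part; the game may well use a custom codec), you can skip the disassembly entirely and just scan a raw dump for zlib header bytes, trying to inflate whatever follows each hit. a minimal sketch, run here against a synthetic dump since i dont have the real files:

```python
# Hypothetical sketch: scan a raw dump for zlib-compressed blobs.
# Assumes the assets are standard zlib streams, which is a guess.
import zlib

ZLIB_HEADERS = (b"\x78\x01", b"\x78\x9c", b"\x78\xda")  # common CMF/FLG pairs

def find_zlib_streams(data: bytes, min_size: int = 64):
    """Yield (offset, decompressed bytes) for every candidate that inflates."""
    for header in ZLIB_HEADERS:
        pos = data.find(header)
        while pos != -1:
            try:
                # decompressobj tolerates trailing garbage after the stream
                out = zlib.decompressobj().decompress(data[pos:])
                if len(out) >= min_size:  # skip tiny false positives
                    yield pos, out
            except zlib.error:
                pass  # the two bytes just happened to match; not a real stream
            pos = data.find(header, pos + 1)

# demo on a synthetic "dump": padding + one compressed asset + padding
fake_asset = b"LEVEL_GEOMETRY" * 20
dump = b"\x00" * 32 + zlib.compress(fake_asset) + b"\xff" * 32
hits = list(find_zlib_streams(dump))
print((32, fake_asset) in hits)  # the embedded asset is recovered at offset 32
```

on a real game you would feed this the ISO or a memory dump instead of the synthetic bytes; anything that inflates cleanly to a decent size is probably an asset.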

