Hacker News | new | past | comments | ask | show | jobs | submit | andr's comments

I know some of those fact checkers. They are career journalists and the bar to tag a post as disinformation is extremely high.

To tag a post, they need to produce several pages of evidence, taking several days of work to research and document. The burden of proof is in every way on the fact checkers, not the random Facebook poster.

Generalizing this work as politically biased is a purposeful lie.


Even granting all that you say is true, it would be trivial for there to be bias in such an apparently rigorous process. All that is required is selective application of the rules.


Did they even have authority to take down posts? That was always Meta's call. The fact-checkers -- which were separate news orgs -- would tag posts.


Yes, you are right. I believe tagging significantly reduced the chance of seeing the post in your feed, so it was similar in effect.


> was similar in effect

Not really. If you argue that tagging was censorship, then you have to say that any algorithmically generated feed is censorship too, because the company is determining what, among everything users post, you see, letting certain posts bubble up to the top and others fall to the bottom.


>...the bar to tag a post as disinformation is extremely high. To tag a post, they need to produce several pages of evidence, taking several days of work to research and document.

Why was the Hunter Biden laptop story thus categorized? As I recall, "several days" did not elapse between the New York Post publication of the story and its suppression on social media.


In my mind, this strongly contrasts with the words [0] of WordPress founder Matt Mullenweg. He says they want to own roughly 5% of the WordPress market, and instead of growing their share of the pie, grow the pie itself.

[0] https://fs.blog/knowledge-project/matt-mullenweg/


Is this because the WordPress market is huge (almost every website), whereas the Elastic, Mongo, etc. market share is not as huge?


WordPress is huge precisely because it was open. Elastic is already quite large and would keep growing massively if they stayed non-restrictive. Now that they've switched licenses, this may have a significant enough effect over the long term that they're leaving a lot of "opportunity pie" on the table.


That's great for Matt, but other companies like Elastic can't afford to go from >50% to 5% of their market.


Well, now organizations like Wikimedia are going to have to drop Elasticsearch everywhere, despite it powering Wikipedia article search and more. So congrats on both missing the GP's point and failing to see how this license change would shrink Elastic's usage/market anyway.


They don't need to drop. They're not providing Elasticsearch and Kibana as a paid service to others. You can just read the FAQ: https://www.elastic.co/pricing/faq/licensing Using Elasticsearch and Kibana is still free, unless you're selling them as a service like AWS Elasticsearch does.


People will just switch to OpenDistro.


What legal reason prevents AWS from enforcing its terms of service, to which Parler has agreed? This is a simple question that I feel is ignored by any of the "censorship" arguments.


It's amazing how arbitrarily lines are drawn in this thread. I'd be eager to hear what people who disapprove of AWS's actions would say in the following situations:

1) Parler banning some of its own users for trolling.

2) Their own employer/company banning people that use it to distribute child porn.

3) A utility company disconnecting users that continuously disrupt its electrical grid.

Of course you'd be fine with 1 and 2, because neither your company nor Parler is a common carrier, as defined by law, and they have no obligation to provide their service to anyone. Guess what: AWS is not a common carrier either. But even a common carrier such as the electric company can terminate users that purposefully violate its terms of service.

At the end of the day, we all know this. We know that private services have no First Amendment obligation. And yet, thread after thread, it's all the same fake outrage masking a much simpler emotion: "I hate it when my guys lose."


The basic configuration still ships with 8GB of RAM. 16GB is $200 extra. I had an 8GB MacBook Pro in 2013. It wasn't enough then, it's absolutely inadequate now.


If you're not running VMs or opening hundreds of tabs in Chrome, it's perfectly fine for everyday tasks.

I'm using my 2016 MBP with 8 GB and have never had a problem except when trying to run multiple VMs.

So 8 GB doesn't seem unreasonable as a minimum. You can upgrade if you need the extra, but a large number of buyers don't need it and prefer a cheaper base price.


Yeah, but is this a pro machine? I'd have no problem with 8GB on an "Air", but for a pro machine to max out at 16GB does seem inadequate.


A lot of pro workloads are CPU-intensive, not memory-intensive. E.g. a lot of audiovisual work.

The pro line has always been defined by CPU and graphics/display capabilities.

Memory is there to purchase if you're in the subset that has memory-intensive workloads.


It isn't there to purchase, it's maxed out at a measly 16GB. Plenty of workloads are RAM-intensive, but people who need more RAM are forced to spend $500 for a slower Intel CPU and GPU.


Yep, I had a 13" 16GB MBP in 2013. These don't seem very enticing.


8GB of LPDDR3 costs about $30 retail. So they went from a 330% markup to a 660% markup. Nice.


Pretty hilarious: 32GB SODIMMs are only 150 GBP in the UK, yet Apple is charging 200 for 16. Similarly, a 2TB disk is 800 GBP; on the open market it's 219.

Some PC laptop vendor could do a whole TV campaign based around the idea of turtle-necked folk setting fire to money and smiling, because it makes them feel better, and that's really all that matters[.. right?]


Please let me know where I can buy an 8GB 2133 MHz LPDDR3 module for $30, because that's a very, very good price.


Wouldn't LPDDR be more expensive for the retail customer because it is not used that often? Most of the Wintel laptops have standardized on regular DDR4 SODIMMs and it looks like Macs use LP variants, where Apple is surely getting volume discounts.


You can find one on Newegg for $35


The only LPDDR3 modules on Newegg are aftermarket RAM for iMacs, where an 8GB stick at a lower speed is $90.


Partly in their defense, the RAM is soldered to the board.

This limits the space available for RAM, so they're probably swapping in higher-density (i.e. more expensive) chips. They also can't just pull a MacBook off the shelf, slap an extra DIMM in it, box it up, and ship it.

It's their own fault but... yeah.


I've never seen LPDDR3 sold retail, how does it work?


Yeah, for those rare-spec chips that are not on the retail market, Apple probably has to place a custom order with the memory chip manufacturers for a low-yield production run, which definitely pushes the unit pricing much higher than mass-produced memory chips. Economy of scale. (Even if Apple is a big customer, it's still a small order if no other computer makers use that spec of chip.)


>Apple probably have to place custom order to the memory chip manufacturers for a low-yield production

Except LPDDR3 is not a custom low yield part since they're present in phones, game consoles, SSDs, cameras, VR headsets, etc. They're as off the shelf as they get.


LPDDR3 is not sold on DIMMs.


I see some on newegg, but not much selection: https://www.newegg.com/p/2E2-003M-000G6?Description=lpddr3&c...


Those are for iMacs, and cost $719.99


It is sold on DIMMs but laptops generally do not have DIMM slots these days.


False, most laptops do have DIMM slots, from gaming laptops to the XPS and ThinkPads.

Only Macs and the PC ultra portables tend to forgo them.


Surely gaming laptops are a small percentage of laptops sold. I’ve seen more people with Chromebooks than gaming laptops.


Is it easy to install your own RAM in MacBooks these days? I used to do it all the time 20 years ago and I know times have changed.


The RAM has been soldered onto the logic boards for quite a while now. Probably approaching 10 years for the first models without removable RAM.


Impossible. It's soldered to the mainboard.


Never say never: someone with a hot-air workstation could sometimes swap chips for denser ones. Never an easy job, though.


Normal DIMMs have an SPD flash chip soldered to the board which identifies the RAM/timing parameters for the firmware/OS. At least on phones, to save a penny they forgo this, and frequently the RAM capacity is tied to the model via hardcoded tables in the firmware. It wouldn't surprise me if something similar is being done on at least some of these "laptop" devices with soldered-on RAM.

So, at a minimum you're flashing new RAM parameters to an SPD, or hacking the device model number somewhere. In other words, even with a rework station it's likely a lot more than just swapping the chips; if there is sufficient PC lineage you might be able to swap the SPD chip as well, but otherwise it's going to be more than a mechanical heat-it-up, clean-the-pads, drop-and-heat-the-new-chip job.


A lot of Apple hardware is locked down nowadays via hardware identifiers and the T2 security chip; it’s likely the RAM modules are not replaceable by the user even if you had the skill to solder a BGA.


Sure. But I don't consider this be a "user installing their own RAM". That's more of a professional doing a delicate operation.


No RAM or storage upgrades on MacBooks for many years now.


230% markup to 560% markup.


This is why it makes more sense to say "3.3x vs 6.6x more expensive". It's better aligned with how people actually think.
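To make the conversion between the two framings concrete, here's an illustrative calculation (my sketch, taking the upthread $30 retail figure and the $100/$200 upgrade prices at face value; the thread's "230%"/"560%" and "3.3x"/"6.6x" are these figures rounded down):

```python
def markup_percent(price, cost):
    """Markup: the amount added on top of cost, as a percent of cost."""
    return (price - cost) / cost * 100

def times_as_expensive(price, cost):
    """Multiplier: how many times the cost you pay in total."""
    return price / cost

old_price, new_price, cost = 100, 200, 30

print(f"{markup_percent(old_price, cost):.0f}% markup -> "
      f"{times_as_expensive(old_price, cost):.1f}x the cost")
print(f"{markup_percent(new_price, cost):.0f}% markup -> "
      f"{times_as_expensive(new_price, cost):.1f}x the cost")
```

The multiplier form avoids the off-by-100% trap: a 100% markup is 2x the cost, not 1x.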


Also see benchmarking - 'times as fast' vs 'times faster' vs 'times slower' etc etc etc eternally confusing.


What? I have never seen that as a problem, and I've probably consumed thousands of benchmarking articles over the years. Benchmarks use either the (in my view stupid) "X percentage points faster" method or the "x times faster/slower" method. The latter is obvious, the former is not.

As to 'times as fast' vs 'times faster' vs 'times slower': I don't see how there could be any confusion.


> What? I have never seen that as a problem, and I've probably consumed thousands of benchmarking articles over the years.

Possibly you gloss over the words as you're used to them, but don't stop to think what each paper really means? I didn't either until one day I stopped and did.

I wrote a blog post about this, but unfortunately never got around to posting it because I think it came across as rather pedantic. I surveyed the language used to express speedup in systems papers using benchmarking in top conferences and produced this table:

https://twitter.com/ChrisGSeaton/status/1253654667579531265/...

Each row is a different way to talk about the same thing, then the English way to say it, then each column is what this looks like for a different example value, then the actual mathematical expression.

Just to emphasise this - each column is the same empirical result, and then each row in the column is a different way to express that same result, that I've seen in a paper.

> The latter is obvious, the former is not.

I'm afraid it isn't as obvious to everyone, and other people may think it's obviously something different to you. You'll notice that some English phrases can be interpreted in multiple reasonable ways. Is running in half the time '1 times faster' or '2 times faster'? You'll see both in the wild.

> As to 'times as fast' vs 'times faster' vs 'times slower': I don't see how there could be any confusion.

Is '2 times as fast' the same as '2 times faster'? Some papers think so, others not.

Have you read the root comment of this thread... that's an example of real-world confusion right there! Just with markup not speedup. It's the same maths and the same confusion.
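To spell out why the phrases collide, here's a toy calculation (illustrative numbers only) showing how one empirical result supports all of the competing English renderings:

```python
# One measurement: the baseline takes 10 s, the new system takes 5 s.
baseline_s, new_s = 10.0, 5.0

times_as_fast = baseline_s / new_s                    # 2.0 -> "2 times as fast"
times_faster = baseline_s / new_s - 1                 # 1.0 -> "1 time faster" (strictly)
percent_faster = (baseline_s - new_s) / new_s * 100   # 100.0 -> "100% faster"
fraction_of_time = new_s / baseline_s                 # 0.5 -> "runs in half the time"
```

Many papers write "2 times faster" for this same result, which is exactly the ambiguity: the reader can't tell whether the ratio or the ratio minus one is meant.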


Okay. I see.

Unsolicited coaching tip: don't spend many cycles on stuff like this at work.


Because... Chrome.


Not that I agree with Apple’s pricing, but you have to assume this isn’t adding an extra 8GB, but changing the existing packages out for ones with double the density. That presumably is more expensive than just adding the same amount again of the lower-capacity packages.


This isn't a phone, most likely the motherboard has free solder pads for extra RAM chips.



I was looking at that teardown myself to see whether this was the case, and I don't think it's enough information to say -- that's a teardown of the model with 16GB of RAM, so it's not impossible that the 8GB model might leave some of those areas empty rather than switching to lower density chips.


It's also possible that it varies depending on DRAM prices. You might have models with 8GB populated single sided and others double side. Same goes for the 16GB models.


If this inspires you to delete your profile, according to the help section, the only way to do it is by emailing [email protected].


You can delete your account via https://triplebyte.com/privacy-center.


This doesn't seem to work. I requested to delete my profile but I didn't get a confirmation email. However, when I requested to make my profile invisible I immediately received an email confirmation.


Clicked on the delete confirmation link, yet I can still login just fine?


There was another site that had a 30-day deletion policy, but if you logged in during the 30 days the deletion was automatically cancelled. Squirrelly behavior, obviously.


I think that's what's going on here. Got another email today letting me know that my request to delete my account is complete, yet I could reset my password just fine.


It would be super interesting to see if this knowledge generalizes with transfer learning. For example, after seeing 50,000 episodes of PacMan, would the GAN be able to recreate Space Invaders with just 5,000 extra episodes?


This is naively bad advice: it doesn't educate the reader, it just creates stigma.

Threads are a fine tool for many jobs. You just need to understand what you are doing. Reduce your shared state. Use locks when more than one thread may write data simultaneously. Use queues/channels for message passing. Profile your lock contention when things get slow. Learn a functional language, so that you get the hang of writing no-side-effect functions and using immutable data structures. This will immensely help you with writing concurrent code.

You should use threads.
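As an illustrative sketch of the queue/message-passing style described above (my example, not from the comment): workers pull tasks from a thread-safe queue and push results to another, so no shared mutable state needs a lock.

```python
import queue
import threading

tasks, results = queue.Queue(), queue.Queue()

def worker():
    while True:
        item = tasks.get()
        if item is None:          # sentinel: time to shut down
            tasks.task_done()
            break
        results.put(item * item)  # result goes back via a queue, not shared state
        tasks.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()

for n in range(10):
    tasks.put(n)
for _ in threads:
    tasks.put(None)               # one sentinel per worker

tasks.join()                      # wait until every queued item is processed
for t in threads:
    t.join()

squared = sorted(results.get() for _ in range(10))
print(squared)
```

The only synchronization here is inside `queue.Queue` itself; the workers never touch a variable another thread writes.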


I admit this is just a shower thought, but airlines could really help in the current situation by transporting sick people across the world (e.g. from Italy to China) to even out hospital demand.


Don't think that's going to fly (pun intended). It's not dissimilar from some of the cruise ships that can't even enter a harbour.


Yeah, but no one wants a planeful of sick people being loaded up at their local hospitals.


Do you think that might be hard to do given the content of the article you commented on?


They would certainly be paid for their services.


Sounds more like a way to increase the spread of viral clusters.


Isn't flying uncomfortable for everyone and dangerous for ill people? Dry air, pressure changes, limited space...


Generally if you're sick enough to need hospital care, traveling even by ambulance is a challenge

