Bill Gates calls Microsoft’s TikTok deal a poisoned chalice (theverge.com)
54 points by pseudolus on Aug 9, 2020 | hide | past | favorite | 78 comments


The underlying interview is pretty amazing, mostly not about this subject: https://www.wired.com/story/bill-gates-on-covid-most-us-test...


It's here: https://news.ycombinator.com/item?id=24100491.

Normally we'd merge the threads but most comments here are responding to the narrow title of the knockoff article.


"I personally believe government should not allow those types of lies or fraud or child pornography [to be hidden with encryption like WhatsApp or Facebook Messenger]."

Disappointing to see that Bill Gates isn't concerned about the slippery slope of regulating encryption. It frankly sounds like he'd be on board with the EARN IT act.

As long as we're only undermining freedom for the children or to make sure there's no wrongthink going on, no problem.


It's odd that the part about hiding with encryption was added by Wired - I wonder how clear it was that that was what he meant, or whether he just meant that he wanted stronger laws against lies and fraud. Although I guess that wouldn't mesh with the mention of child pornography, which is already disallowed.


It's possible the encryption part is just Wired's editorializing and not what he intended. I might be jumping to conclusions.


Or that they clarified his point with him, post-interview, after typing up and reading the transcript.


> Disappointing to see that Bill Gates isn't concerned about the slippery slope of regulating encryption.

It's much worse than that. Over the last several years he has been increasingly arguing against encryption. People should not be charitable toward him regarding his anti-privacy, anti-encryption, pro-authoritarian position, call it what it is. He's arguing for tyranny, plain and simple.

He isn't just not concerned about the slippery slope, he's fully aware of what he is arguing for. This isn't a naive, technologically or politically oblivious person.


I've met a couple of really smart people (like CEOs) who are very good for their company + employees and have a very strong opinion about this.

They are pro security for sure, but...

Security shouldn't be abused by malicious actors. And those malicious actors shouldn't be able to hide behind encryption.

In the same sense, privacy is important, but I'm not sure if it's that important in Covid times (e.g. the decentralized requirements from Europe).

In short: he's not saying security is bad. He's saying government spying isn't that bad IF the government can be trusted.


> In short: he's not saying security is bad. He's saying government spying isn't that bad IF the government can be trusted.

Which is a long way to say "security is bad because it might keep people safe who we don't want to be kept safe".


> He's saying government spying isn't that bad IF the government can be trusted.

Yeah, IF. In reality however, it's orders of magnitude worse than any crimes that could be prevented by it.


> It frankly sounds like he'd be on board with the EARN IT act.

"Infinite Retail Networks Invalidate Technology" can easily be achieved by segregating dot.com to its own "Internet/Network".

"Snowglobe or be snowglobed" is a better analogy than "everybody lives in a glass house except them".

ISH(water) and IRN(oil) don't mix without constant agitation, so who better to add "business cycle" to archived knowledge and put a price on bits and bytes?


His point is more nuanced than that, and the viral spread of oversimplified complex ideas through social media is part of what he talks about.

"Q: As someone who has built your life on science and logic, I’m curious what you think when you see so many people signing onto this anti-science view of the world.

A: Well, strangely, I’m involved in almost everything that anti-science is fighting. I’m involved with climate change, GMOs, and vaccines. The irony is that it’s digital social media that allows this kind of titillating, oversimplistic explanation of, “OK, there’s just an evil person, and that explains all of this.” And when you have [posts] encrypted, there is no way to know what it is. I personally believe government should not allow those types of lies or fraud or child pornography [to be hidden with encryption like WhatsApp or Facebook Messenger].

Q: Well, you’re friends with Mark Zuckerberg. Have you talked to him about this?

A: After I said this publicly, he sent me mail. I like Mark, I think he’s got very good values, but he and I do disagree on the trade-offs involved there. The lies are so titillating you have to be able to see them and at least slow them down. Like that video where, what do they call her, the sperm woman? That got over 10 million views! [Note: It was more than 20 million.] Well how good are these guys at blocking things, where once something got the 10 million views and everybody was talking about it, they didn’t delete the link or the searchability? So it was meaningless. They claim, “Oh, now we don’t have it.” What effect did that have? Anybody can go watch that thing! So I am a little bit at odds with the way that these conspiracy theories spread, many of which are anti-vaccine things. We give literally tens of billions for vaccines to save lives, then people turn around saying, “No, we’re trying to make money and we’re trying to end lives.” That’s kind of a wild inversion of what our values are and what our track record is."


I read all of what you pasted before I posted. I don't see how it changes my point at all.

There's a complete disconnect between the first statement about encryption and the second. Unless Bill Gates really thinks that the vast majority of spread is through private messages.

Even accepting the premise that censorship is the right approach, the vast majority of spread comes from public, unencrypted post. The connection to encrypted messaging is tenuous at best.

Acting as if killing encryption is the solution to combating misinformation sounds like a convenient strawman that Bill Barr would love to throw around.


> Unless Bill Gates really thinks that the vast majority of spread is through private messages

Which would be absolutely wrong. (Not arguing, just backing up your point.) We've seen time and time again that misinformation doesn't spread as well through back channels and niche forums as it does through public figures, media and very notorious social networks. A post on 4chan will look like a scam, a post on a Facebook group of 200,000 people will look legitimate-ish. Encryption is not the tool of misinformation, people's gullibility, money, and compromised values are.


I think Gates's point has merit with WhatsApp and like, India, where the group chat mob is very very real and very dangerous [1], and since it's all E2EE, there's no easy way to monitor the proliferation of this sort of misinformation. (Though my understanding was that they put some filters in to the front-ends of the app to reduce some of the more aggressive spreading, and further reduces the number of forwards you could make.) And either way, banning E2EE isn't the answer and is a horrifying proposition for all of the slippery slope reasons we're all familiar with.
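For concreteness, a forward throttle of the kind WhatsApp has publicly described (a multi-chat limit, and a single-chat limit for "frequently forwarded" messages) can be sketched in a few lines. The exact thresholds and function names here are illustrative, not WhatsApp's actual implementation:

```python
FORWARD_LIMIT = 5            # max chats per forward, normally (illustrative)
FREQUENTLY_FORWARDED = 5     # hops after which a message counts as "hot"

def allowed_recipients(forward_count: int) -> int:
    """How many chats a message may be forwarded to at once."""
    if forward_count >= FREQUENTLY_FORWARDED:
        return 1             # hot messages: one chat at a time
    return FORWARD_LIMIT

# A fresh message can go to 5 chats; a widely-forwarded one to only 1.
print(allowed_recipients(0), allowed_recipients(7))  # 5 1
```

Note this works entirely on client-side metadata (a forward counter), which is why it can coexist with E2EE: the platform never needs to read the message body.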

You're also absolutely right that most of the spread is just happening in the clear on normal FB or Twitter posts, and not as encrypted WhatsApp messages.

I think that FB has a moral imperative to significantly reduce the ease of sharing, frankly. It's a single click to commit a share/forward that passes along misinformation instantly with no effort on the part of the sharer. It should require _some_ effort: having to create your own post, for instance. It would dramatically slow the viral-spread-like-wildfire aspect of the most egregious pieces of viral misinformation: where people simply see a salacious headline that confirms their biases or aligns with their views and mindlessly click share.

That is addressable by reducing the patterns meant to increase viral spread, which admittedly is a huge part of what led to Facebook's success, but it's now causing far more harm than good.


To WilTimSon's point: Having been on and off Tor, 4chan, 7chan, and a number of other networks and sites that host pretty questionable and sketchy stuff, I'd easily say the vast majority of people using these back channels don't take themselves seriously.

4chan mocked the hell out of the Christchurch shooter and Elliot Rodgers. None of those comments would be agreeable to the general public. The only places I've seen these guys, as well as Tsarnaev and the Columbine shooters praised is... Tumblr, Twitter, Reddit.

While I'm concerned about child pornography, there's more of it on Pornhub and XNXX than I've seen on Tor. And it's the low barriers to access on public porn sites that use automated filters that cause more of an issue. Bing itself will find adolescents if you type in keywords like "2k4" and "2ke" with adult terms.

Maybe Bill should focus less on trying to expose private paths and more on addressing the glaring gaps that come with over-automating platforms.


If you take the text at face value, he's making a pretty modest proposal:

(1) The current velocity of disinformation is high -> (2) this leads to socially harmful outcomes -> (3) providing P2P encryption in turnkey form via social network apps isn't a good idea.

I don't see the suggestions of panopticon you do. I see someone saying 'Most people on social media platforms aren't responsible, thoughtful enough media consumers to earn a right to encryption.'

Personally? I think it's a perspective worth considering. {Low-effort P2P encryption + low-effort forwarding} is a fundamentally different social value proposition than {Medium-effort P2P encryption + medium-effort forwarding}.


> I think it's a perspective worth considering. {Low-effort P2P encryption + low-effort forwarding} is a fundamentally different social value proposition than {Medium-effort P2P encryption + medium-effort forwarding}.

How can that be achieved short of outlawing all encryption, heavily enforcing it and then allowing encryption "for the good guys"? If you outlaw encryption on FB or WhatsApp, even if you somehow get Telegram on board ... if people care about encryption (which I doubt, but let's pretend), somebody will provide a low-effort solution and you can't really stop it.


I'd hazard to say most people outside of tech (in free speech countries) don't care about encryption as a key feature.


It seems like he's against the notion of "I disagree with what you're saying, but I will defend to the death your right to be misinformed."

Basically, if something can be proved as false information, and yet it's still allowed to spread to millions of people that unlike the average HN commenter will not do the research and end up believing the misinformation, what's the value in that? Why would we allow something that's false to spread?

This is a dilemma that creates a lot of dissonance in my mind. On one hand I support encryption, and believe that banning certain types of information on the basis of being "wrong" is a slippery slope when all of that hinges on what "wrong" means and who controls the definition of "wrong." On the other hand this kind of misinformation can cause material harm to society and innocent people who are affected by the misinformed. Without censorship or any way of shutting down this information before it reaches people who would hypothetically always believe it if they were to know of its existence, it feels like the most we can do is stand by and watch it spread. Imagine trying to convince millions of people to change their minds once they've been set for good, and that those people are now convincing your friends not to vaccinate their children, or similar.

I think what Gates is saying with regards to child pornography is that it's the only type of digital information that is prohibited to be in possession of by law. The reasoning behind this is that having child pornography distributable creates a market by which other child pornographers will have demand to keep producing more of it, which by the nature of child pornography means exploiting children. So I guess what is meant is that if you can be arrested for consuming one type of materially harmful information that provides no net value to society, why not stop other kinds of materially harmful and valueless information from spreading? There is already a status quo. Child pornography proves that there is a class of information that is so damaging to society to have publicly available that governments believe it must be outlawed (censored?) entirely.

However, child pornography is different from publicly available viral videos. By its nature child exploitation doesn't spread virally online. It isn't mainstream. Viral videos on YouTube are specifically produced to reach as many people as possible, and being safe enough to exist on YouTube is pretty much necessary for this to occur. People would have to take steps to prevent child pornography from spreading to the wrong people, because of the nature of modern society, where any average person would be inclined to get them arrested if they knew they were in possession of it. It is firmly ingrained in a vast majority of society that child pornography and exploitation are evil, and the minuscule number of people who believe otherwise will be endlessly shamed for their viewpoint; the shamers will never be debated on that point, because nobody generally debates things related to child pornography, and if you are discovered to like such a thing then you instantly become the scum of the Earth in the mind of almost everyone who knows this fact. But those same people are happy to let harmful misinformation slide if they can't or won't understand what it actually means: information that could harm children just the same, like anti-vaccination material, but marketed from the standpoint of protecting children instead of exploiting them. And if criminals are going to use encryption to hide it all, then that's how it's going to be. Encryption is just a tool. Criminals can buy knives, too.

Still, I think if we're going to allow misinformation to spread on the principle that all information should be allowed to spread, because of the fact that it's information, then we should be prepared to clean up the aftermath of the rioting later.


Thanks for the thoughtful treatment. It's absolutely a can of worms, with no obvious best answers.

The distinction (whether intended in Gates' source or of my own mind) I find interesting is between veracity and velocity.

Like you, I have enormous problems with deploying censorship to completely obliterate any type of thought from readable form. I see it as society's job to target the demand (the reader's desires and education), not the supply (the writer). If people want to start believing crackpots in large numbers, well, maybe society needs to look to its basic education systems...

That said, velocity is somewhat of a trickier matter.

As a thought exercise, what would the net effect be if Facebook implemented systems that resulted in misinformation being slowed? I.e. I share {link}, it takes x1.5 base time to propagate to my friends' feeds.

The information would still be available, and shared, but the viral velocity would also be reduced.
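A back-of-the-envelope model shows why even a pure delay bites: if each completed "hop" multiplies reach by a branching factor, then stretching the per-hop time by 1.5x means fewer hops fit in the same window. All numbers here are made up for illustration:

```python
# Toy exponential-spread model: each completed hop multiplies
# reach by a branching factor r. Throttling the per-hop time
# means fewer hops complete within the same wall-clock window.
def reach(r: int, hops: int) -> int:
    return r ** hops

window = 6.0    # arbitrary time units
base_hop = 1.0  # time per hop, unthrottled
slow_hop = 1.5  # throttled per-hop time (the x1.5 above)

fast = reach(3, int(window / base_hop))  # 6 hops -> 729 feeds
slow = reach(3, int(window / slow_hop))  # 4 hops -> 81 feeds
print(fast, slow)
```

Because spread is exponential in the number of hops, a modest per-hop delay compounds into a large reduction in reach over any fixed window.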

(Note: Of the available options, delay-only seems far preferable than anything else that could lead to a de facto shadowban)


Trying to stop information from spreading is akin to Prohibition and will cause far more trouble than it solves. This is presuming that censoring it isn't inherently counter-productive to begin with.

Child pornography is a strange example. Someone would serve more time for possessing an image than for molesting someone. This is based on the notion that someone might be distributing it or doing something else.

Some NGOs bizarrely make the claim that possession of child pornography is worse than committing a physical crime.


Shame about the title that Wired gave it which is why I skipped out on it at first. Makes the interview sound like yet another boring Covid chit chat. Glad I went back to read it.


I do expect TikTok to lose users after Microsoft acquires it. But not because of anything Microsoft might do. TikTok just seems like a fad social network that will inevitably lose the cool factor eventually.


I'm not so convinced. Let me first say, I can't stand TikTok and find its constant stream of annoying music angering.

That out of the way, my wife and daughter love it, and I can see why. It made it easy for anyone and everyone to share clips, with a touch of social. It makes YouTube look like a slow bloated dinosaur. I just don't see people giving it up.


I mean Vine was the same concept and died due to lack of profitable monetization model.

How does TikTok earn money?


If you ask me, ads with the ability to pay a monthly fee to remove them is the ideal situation.


There’s an ad when you return to the app. Since you’re expecting a full screen video to auto play, they often aren’t out of place. You can skip them immediately as well.

Besides ads, one would think there’s endless ways to monetize a strong fun brand.


> one would think there’s endless ways to monetize a strong fun brand

YikYak, Tumblr, and Vine come to mind, and that's just social media space.

When you start free, monetizing in a way that doesn't totally destroy goodwill from your existing customers is tough.


I don't really see the comparison to those apps, though. None were nearly as big as musically/tiktok. I think it's unfair to even compare it to Vine, which is featureless in comparison.


TikTok has become too top heavy with its current creators. Its most popular stars not only live together but have conglomerated to form their own talent agency as well.

You may not see it on your feed, but millions of middle schoolers seem to be highly invested in the drama that goes on between these people. These creators and their followers are what is known as “straight TikTok”.

What’s to stop them from demanding the big bucks from Microsoft or leaving for another platform?


Her name is Charli D'Amelio. Thank me later.


This is rather common knowledge. I don’t really understand your comment. For what am I supposed to thank you? And why later?


Tumblr did just fine for years after the Yahoo acquisition


Until the part where Yahoo asked Tumblr to clean up its act in a gambit to make it actually profitable with advertising.


It’s hard to tell, the same could definitely have been said for insta and snapchat, but also loads of apps that didn’t make it, so it’s a bit of a crapshoot.


I'm not sure but I get the feeling that Snapchat is on the decline. Their killer feature was stories but now every app has those (notably Instagram stories seem very popular).


Instagram is also in the process of knocking them off. That didn't work out so well for Snap.


Eh, Snap managed to kneecap themselves with an extremely controversial UX redesign shortly after Instagram stories, and this was an app that had a notoriously bad UX to begin with.

Snap was also not doing well outside the US, but I haven't heard anything like that about TikTok.


Snap seems to be doing just fine, with a share price up 150% since the virus lockdowns began. FB shares are also up, but only half as much.


Most users will never feel any change despite who owns it, unless MS pull a FB and inject multiple streams of ads into it overnight.


Not to mention the psychological effects it has. Research has shown Instagram's rise strongly correlated with a rise in teen suicide rates. TikTok seems worse. Only a matter of time before people realize it's like cigarettes, if not worse.


Could you please point me towards said research?



Thank you!



TikTok is better than YouTube's new 15-30 second videos, and far better than Facebook as well, both in content and in keeping the user engaged. The whole thing is a waste of time and a distraction, but somebody figured out a better, newer algorithm. At this point in the world, if you have a mobile, you are being tracked.

In the last week my YouTube recommendations have been all about bombs after I watched a couple of Beirut videos. That clearly says the algorithm is broken. TikTok was on fire during the BLM protests but still kept regularly showing other content throughout.

Microsoft first has to figure out and explain why it has both Teams and Skype. At this point people are migrating from SharePoint to Confluence or other software to manage documents; the SharePoint GUI pretty much sucks. The new Edge is a copy of Chrome; for the first week it consumed less CPU, but now I can hear the CPU fan just like with Chrome. MS should focus on improving software rather than media or unrelated things. Right now the only software I like in Win10 is Task Manager; I use it so much, thank god that UI has not changed.

Google never had luck with social media platforms. Facebook came and took out Orkut, and so many other things they started have all closed down. Google Search also needs improvement; it's fast, but the results are not as good as they were 5 years ago or before.


I guess Microsoft may drop ~$30 billion on this poisoned chalice out of goodwill.


I think Bill Gates is worried that assets acquired in a $30B deal may take inordinate amounts of time and attention from senior leadership, at the expense of the rest of a $1.6 trillion business.


I can totally see this happening.

This is the pathology of large business executives - once you have a mature business, it is boring, and pretty much anything else seems exciting, and so you focus on this new thing, and worse, start promoting it internally as a model for everything _that's actually successful_.

Due to my career wanderings, I've seen this a few times, and many times the hot new thing has been a mountain of technical debt that has spent the last few years doing everything possible to juice growth and spinning amazing stories to investors, the press, etc. They've been selected for telling a great story. The executives at OldCorp tend to have a natural immunity to the messaging and scheming of their interior ecosystem, but they are not used to these invasive species.

If this is what Gates is getting at, I think that's very astute of him.


Boiled down to: nobody ever gets a promotion for maintaining the success of something that's already successful.

Which is a shame, but the corporate game theory optimality of leapfrogging between hot projects has held true at essentially every company I've ever worked for.


I think it's more complex than the stewardship problem - that problem, specifically, is internal, and the executives who engage in change for the sake of advancement usually do so knowingly.

I'm talking about something completely different - the executives in this case are not knowing participants, they are in the state of mind to be snowed.


I haven't seen much difference between the two, if I understand what you're differentiating, as the type of executive that tends to project hop doesn't generally care about the underlying project, past its ability to further their own aims.

So whether it's an external acquisition or internal project is somewhat immaterial.


It's the difference between being a knowing and un-knowing participant. Having been quite close to people in this tier, the difference is very significant.


He's chairman of the board. He can stop it if he wants to.


Not any more; he stepped down earlier this year.

https://news.microsoft.com/2020/03/13/microsoft-announces-ch...


He is no longer part of the board. He isn't chairman since 2014 and left the board in March this year.


Could someone parse the analogy for me? Does he mean a deal like this will undo MS slowly over time?


I'm just speaking based on my interpretation. I think the "chalice" is a jeweled cup that seems valuable, but the poison is not what you would expect to drink. So, the deal seems like a big prize, but comes with hard-to-understand disadvantages that make it a poor choice. E.g. someone bought tumblr and tried to make it advertiser friendly by removing NSFW content. As soon as they did, the community that was still flourishing there abandoned it and they had no eyeballs to sell.


Not necessarily undo MS, but rather that the deal looks attractive but may actually not be any good. In other words, it could end up being one of those deals where a few years down the road MS has to write the purchase price off as a loss.


Given the opportunity though, it still seems like a risk well worth it.

Businesses rarely (if ever) fail because of an acquisition gone wrong. But an acquisition gone right can have great returns.

If Instagram flopped when Facebook bought it, Facebook would still be doing just fine. But since it did go well.. yeah.

So I believe that it's still probably in the investor's best interest if MS does the deal even if it's only like a 35% chance that it isn't a complete failure.


I mean some of the biggest examples have been tech. AOL and Time Warner was a hot mess that destroyed both, for instance.


AOL would have been destroyed anyway. They were a dialup ISP who saw their time coming. They were smart to take their inflated stock price to buy a real asset.

Of course they mismanaged it and it should have been a reverse takeover like Next/Apple.


From the Oxford English Dictionary [0]: "An award or honour which is likely to prove a disadvantage or source of problems to the recipient; the phrase is found originally in Shakespeare's Macbeth (1606), in a speech in which Macbeth flinches from the prospective murder of Duncan."

[0] https://www.oxfordreference.com/view/10.1093/oi/authority.20...


He's saying that managing TikTok might be more difficult than Microsoft can imagine. Here is the relevant part from the interview:

-----

As you are the technology adviser to Microsoft, I think you can look forward in a few months to fighting this battle yourself when the company owns TikTok.

Yeah, my critique of dance moves will be fantastically value-added for them.

TikTok is more than just dance moves. There’s political content.

I know, I’m kidding. You’re right. Who knows what’s going to happen with that deal. But yes, it’s a poison chalice. Being big in the social media business is no simple game, like the encryption issue.

So are you wary of Microsoft getting into that game?

I mean, this may sound self-serving, but I think that the game being more competitive is probably a good thing. But having Trump kill off the only competitor, it’s pretty bizarre.

Do you understand what rule or regulation the president is invoking to demand that TikTok sell to an American company and then take a cut of the sales price?

I agree that the principle this is proceeding on is singly strange. The cut thing, that’s doubly strange. Anyway, Microsoft will have to deal with all of that.

-----


Can someone explain how encryption comes in play with social media moderation? I'd understand that from less technical 'thought leaders' but I imagine Bill is still a pretty technical guy so I was kinda surprised to see that.


Facebook has announced that they want to move to end-to-end encryption for messaging, similar to Whatsapp. This would mean that any moderation being done would have to happen on the device. But a company that's promising end-to-end encryption presumably wouldn't do that.

That's kind of the point; switching to end-to-end encryption means the company is (supposedly) getting out of the moderation business, at least for these messages.
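A toy illustration of why that follows, using a one-time-pad XOR as a stand-in for a real E2EE cipher (this is a sketch of the concept, not Facebook's or WhatsApp's actual design, and XOR pads are not how production E2EE works):

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so this both encrypts and decrypts.
    return bytes(k ^ b for k, b in zip(key, data))

# The key exists only on the two endpoint devices; the server
# never sees it.
key = secrets.token_bytes(64)
plaintext = b"message the server should never read"

ciphertext = xor_cipher(key, plaintext)

# The platform relays only the ciphertext, so any scanning or
# moderation would have to happen on-device, before encryption
# or after decryption.
assert xor_cipher(key, ciphertext) == plaintext
```

The moderation question is exactly this architectural one: once the relay in the middle holds only ciphertext, content rules can only be enforced at the endpoints.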

The kind of thing to worry about is social media's role in encouraging mob violence, massacres, and genocide. I assume Bill Gates is taking a global perspective on this.


Microsoft in talks to buy TikTok's operations in 5 Eyes countries only strikes me as strange. Why not their European operations as well?

Was this acquisition even on their roadmap before talks of a ban started making the rounds? It seems completely out of sync with the type of acquisitions MS makes, like Github or Linkedin.

What do MS shareholders have to say about forking over $30b or thereabouts to bankroll POTUS' China panic?


>President Trump suggesting the US Treasury will need some type of cut from any acquisition.

uhm what? Has Mr ArtOfTheDeal still not understood that he's a president, not a profit-seeking CEO? How bizarre.


Has he heard of taxes?


I agree in the sense that TikTok is a trust and safety disaster at the moment and I’m not sure Microsoft realizes the investment required to clean it up and make it truly safe.

One of the most disturbing features to me is that you can be logged out and download nearly any video to your phone. It would be naive to think pedophiles are not actively downloading videos of minors dancing provocatively and sharing it.


Videos of minors dancing is by no means native to tiktok or in any way illegal. If you think those videos are socially problematic (which many do), then you need to attack the incentive structure that breeds that type of content.


agreed the algorithm is especially geared to minors to make mini music videos that are highly sexualized


Yes, but so is Hollywood’s, for example. The algorithm is geared around imitation style creative proliferation. It shows you videos similar to others you've liked. It simply seems like lots of people like to watch other people dance to catchy tunes. And even watch people react to watching other TikTok videos. The top videos on the platform get hundreds of millions of views, far more than what you’d expect if this content was supported by pockets of underground “pedophiles”. On one hand TikTok is an amazing and disruptive tool for creative expression. On the other hand, it simply reiterates our timeless cultural desire for sexualized female-gendered expression. If you find this aspect of human nature problematic we have also developed tools and institutions to combat it: religion. But those seem to have pedophile problems too. Joke aside they have fallen out of popularity of late.


I totally see Microsoft ruining TikTok simply because of their corporate culture. Microsoft has had an astonishing history of destroying billions of dollars in value simply because they can't manage their products better. Internet Explorer, Bing and their mobile phones represent entire industries where Microsoft has been totally decimated despite enormous advantages in terms of resources and monopolies.


I wasn't aware that the bar for having a good corporate culture was to be so utterly successful at everything you ever try that you end up with a monopoly in every single sector you compete in and become the most successful company in human history. Somebody should tell Nadella that the new corporate culture he is building at Microsoft, which has led to their enormous turnaround in share performance and public perception, is all in vain and has failed because Ballmer laughed at the iPhone once.


Your premise is very obviously false on the face of it: the notion that a company can own everything, always succeed, and generally dominate every major segment it enters.

Your criticism is that Microsoft isn't a $5+ trillion company, doesn't have five plus monopolies (desktop OS, mobile, browsers, search, and a few others that would go along with those positions including advertising), and basically isn't all knowing and all powerful.

What you're projecting is impossible of course. No company could ever accomplish all of those things. What you're holding up as a supposed indication of incompetence, is in fact unavoidable: failure will happen, you will not always win, your products will not always succeed, and most likely you will fail the majority of the time; it doesn't matter who you are, or how big you are.


You realize that Microsoft is the third most valuable company in the US? It was the most valuable in 2001.

The browser wars are old. The only thing having dominance in browser market share is good for is advertising.

As far as mobile, it hasn’t been that great for Google either. It came out in the Oracle trial that Android had only made Google $22 Billion in profit from inception through 2016 (https://www.theverge.com/2016/1/21/10810834/android-generate...), much less than it paid Apple during the same time period to be the default search engine for iOS.

There is no reason to believe that the economics are any better now.



