"The study’s methodology is somewhat questionable: it counts as French speakers all the inhabitants of countries where French is an official language, which probably won’t be true of all of them. And as a second language, English will almost certainly remain the lingua franca"
Sounds like the study measures colonial vestiges and not real world usage.
Same. Flux is really simple and most of the implementations just remove some boilerplate and add some conventions. Given more time there might be a framework that has everything that I want, informed by comprehensive, mature, real-world use cases, but I haven't yet seen anything compelling enough in the dozen or so libraries I've looked at. I've found a lot of value in having granular control so I can explore different conventions and optimizations.
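To make "Flux is really simple" concrete, here is a minimal sketch of the core pattern: a dispatcher that stores register callbacks with, and a single dispatch pipe that delivers every action to every store. This is a hypothetical illustration, not the API of any particular Flux library.

```typescript
// Minimal Flux-style dispatcher sketch (hypothetical names).
type Action = { type: string; payload?: unknown };

class Dispatcher {
  private callbacks: Array<(action: Action) => void> = [];

  // Stores opt in by registering a callback.
  register(callback: (action: Action) => void): void {
    this.callbacks.push(callback);
  }

  // Every action reaches every registered store; each store
  // decides for itself whether the action is relevant.
  dispatch(action: Action): void {
    for (const cb of this.callbacks) cb(action);
  }
}

// Usage: a tiny counter "store" driven by the dispatcher.
const dispatcher = new Dispatcher();
let count = 0;
dispatcher.register((action) => {
  if (action.type === "increment") count += 1;
});
dispatcher.dispatch({ type: "increment" });
dispatcher.dispatch({ type: "increment" });
console.log(count); // 2
```

Most Flux libraries are roughly this plus conventions (action creators, store base classes), which is why keeping granular control over those conventions is cheap.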
> The official name is io.js, which should never be capitalized, especially not at the start of a sentence, unless it is being displayed in a location that is customarily all-caps (such as the title of man pages.)
One of the things that always bothered me about interfaces was the inability to define a default implementation, especially when developing UIs. A proper mixin system (like what's used by React's components) has solved this particular problem for me.
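A sketch of the point above: interfaces only declare a shape, while a mixin can carry a default implementation that individual classes may still override. The names (`Renderable`, `withDefaultRender`, `Widget`) are made up for illustration.

```typescript
// Class-expression mixin supplying a default implementation,
// which a bare interface cannot do.
type Constructor<T = {}> = new (...args: any[]) => T;

interface Renderable {
  render(): string;
}

// The mixin owns the default behavior; classes opt in by wrapping.
function withDefaultRender<TBase extends Constructor>(Base: TBase) {
  return class extends Base implements Renderable {
    render(): string {
      return "<div></div>"; // default, overridable per class
    }
  };
}

// Gets the default for free.
class Widget extends withDefaultRender(class {}) {}

// Overrides the default where it matters.
class FancyWidget extends withDefaultRender(class {}) {
  override render(): string {
    return '<div class="fancy"></div>';
  }
}

console.log(new Widget().render());      // the mixin's default
console.log(new FancyWidget().render()); // the override
```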
I agree with you about Haskell. I almost went into a discussion of how Rust does this. Rust doesn't have subtypes, but it does have code reuse if you explicitly declare it. Caveat: I'm just an admirer of Rust, not a user, so maybe it sucks too, but this seems closer to how things ought to be.
The only gripe I share is scoping, and that's fixed in LiveScript. I've started writing LiveScript in a fairly explicit style: for example, I use parentheses where they improve readability, and I always use curly braces for object literals. Qualitatively, I think it's increased the readability of my code way beyond JavaScript. It's nice to have curly braces carry specific meaning instead of being littered all over the place.
Not sure what you're referring to, re: 10 million units. They haven't sold anywhere near that.
Currently their price-to-sales ratio is closer to 150. Facebook paid a massive premium no matter how you cut it. Even if they grow sales tenfold over the next three years, they'll still have paid a premium. I don't think the Oculus purchase can be justified in any near-term financial regard; it's a very long-term bet.
"Oculus has said it will start selling a consumer product in 2014, and till now has shipped an impressive 60,000 units to developers at $350 each"
I qualified it as 0.666x projected sales (intentionally not rounded >:), so yeah that 10M was a projected figure, which unfortunately I can't find the source for. By the end of 2015 I think Oculus will look like a steal, unless the goodwill is shot.
2-3x profit, right, not revenue? 10M units at $300 would not land anywhere near $3 billion in profit once you get to the bottom line. Lots of licensing fees coming up, though.
Generally the starting point for any tech acquisition is a 3x-5x revenue multiple. Oculus is hardware, so their margins are lower than software's, pushing them closer to the 3x mark.
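The back-of-the-envelope math behind the last two comments, spelled out. All figures are the thread's own hypotheticals (the unsourced 10M projection, $300/unit) plus an assumed hardware gross margin; none of this is real Oculus data.

```typescript
// Revenue-multiple vs. profit arithmetic from the thread.
const units = 10_000_000;      // projected units (unsourced, per above)
const pricePerUnit = 300;      // $ per unit
const revenue = units * pricePerUnit; // $3B projected revenue

// Typical tech-acquisition starting point: 3x-5x revenue.
const lowValuation = 3 * revenue;  // $9B
const highValuation = 5 * revenue; // $15B

// Hardware margins are thin, so profit is a small slice of revenue.
const assumedGrossMargin = 0.2;                    // assumption: 20%
const grossProfit = revenue * assumedGrossMargin;  // $600M, nowhere near $3B

console.log({ revenue, lowValuation, highValuation, grossProfit });
```

The takeaway matches the comment above: $3B is the *revenue* on those projections; bottom-line profit would be a fraction of it, which is why hardware acquisitions skew toward the low end of the multiple range.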