That's way higher than I thought. Is there any evidence? Dresden was 25,000, and the V2 and V1 campaigns had lower death tolls. So this is high even for an aerial bombing campaign.
[edit] I don't get why I'm getting downvoted. Are people making assumptions because I mentioned Dresden? Get a hold of yourself.
Aerial bombardments typically target areas with ~0.01 people/square meter, and those people are often in hardened shelters. A protest may have 1–4 people/square meter out in the open. Attacks targeting the latter cause orders of magnitude more casualties for the same amount of firepower.
And the crowd itself can be deadly if it gets too dense, due to panic or otherwise. For example, there have been at least two crowd collapse events with >1000 deaths in the Mecca pilgrimage.
I think it's just a stupid comparison. An aerial bombing campaign on a single city 80 years ago vs a government coming down on protesters spread across over 100 cities -- that was the best reference you could find to doubt these numbers?
First: it's not one day, it's over three weeks. Second: the 25,000 is an extrapolation. The Islamic Republic of Iran has a tendency to admit to 10-15% of the death toll, and it has admitted to 3,000 deaths.
Have you considered finding a conformal transformation† that maps a square to any other possible shape, as long as the shape doesn't have any holes? Such a transformation always exists by the Riemann Mapping Theorem, and is unique as long as you specify in addition (1.) which point the square's centre maps to, and (2.) the angle of rotation around that point. Not sure if anyone's ever tried that.
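The angle-preservation property the footnote describes is easy to check numerically: any analytic function with a nonzero derivative is conformal, so the images of two small orthogonal steps still meet at ~90°. A minimal sketch (using f(z) = z², a stand-in for whatever square-to-shape map you'd actually compute -- getting the real Riemann map usually needs Schwarz-Christoffel-style numerics):

```python
# Numerical illustration of conformality: an analytic map preserves
# angles wherever its derivative is nonzero. f(z) = z**2 is analytic,
# so two orthogonal steps at z0 should map to two steps meeting at ~90 deg.
import cmath

def f(z):
    return z * z  # any analytic function with f'(z0) != 0 works here

def image_angle(z0, d1, d2, h=1e-6):
    """Angle (radians) between the images of small steps h*d1 and h*d2 at z0."""
    w1 = f(z0 + h * d1) - f(z0)
    w2 = f(z0 + h * d2) - f(z0)
    return abs(cmath.phase(w2 / w1))

z0 = 1.0 + 1.0j
angle = image_angle(z0, 1.0, 1.0j)  # orthogonal input directions
print(angle)  # very close to pi/2
```

Note that conformality fails exactly where f'(z) = 0 -- which is why the square's corners (where the boundary angle must change) are the delicate part of any such construction.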
If you actually want more aesthetic freedom, you can compose with an arbitrary diffeomorphism of the square to itself. But I think that might usually look worse.
† - That is, preserving all angles, including right angles. The terminology stems from the output angles conforming (???) to the input angles.
i'm torn between my stupidity re:your first comment and my love of eliminating pinch points (now you're speaking my language). i need to read up a lot to better understand your suggestion!
(most pinch points are gone these days, it just required a lot of edge case hunting)
Always send "PRAGMA foreign_keys=ON" first thing after opening the db.
Some of the type sloppiness can be worked around by declaring tables to be STRICT. You can also add CHECK constraints that a column value is consistent with the underlying representation of the type -- for instance, if you're storing IP addresses in a column of type BLOB, you can add a CHECK that the blob is either 4 or 16 bytes.
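Both suggestions fit in a few lines with Python's stdlib sqlite3 (table and column names here are made up for illustration):

```python
# Sketch of the two suggestions above: enable foreign key enforcement
# per-connection, and use a CHECK constraint to pin down a BLOB column's
# representation (here: a 4-byte IPv4 or 16-byte IPv6 address).
import sqlite3

conn = sqlite3.connect(":memory:")

# Foreign key enforcement is OFF by default in SQLite and must be
# re-enabled on every new connection.
conn.execute("PRAGMA foreign_keys = ON")

# On SQLite >= 3.37 you could also append STRICT after the closing
# paren to get real column type enforcement.
conn.execute("""
    CREATE TABLE hosts (
        name TEXT NOT NULL,
        addr BLOB NOT NULL CHECK (length(addr) IN (4, 16))
    )
""")

conn.execute("INSERT INTO hosts VALUES (?, ?)", ("v4", bytes(4)))   # ok
conn.execute("INSERT INTO hosts VALUES (?, ?)", ("v6", bytes(16)))  # ok
try:
    conn.execute("INSERT INTO hosts VALUES (?, ?)", ("bad", bytes(5)))
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # CHECK constraint failed
```

The CHECK runs inside the database, so it also catches writes from other clients that never heard of your application-level validation.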
Fun fact: the term "lingua franca" originally referred to a language closer to Portuguese than to the French spoken at the time. Eventually, though, French itself did become the lingua franca for some time.
Alternatives to Amazon.com? I'm totally serious when asking about this. I think delivery apps (like the one comically named "Deliveroo") are all potential alternatives to Amazon, but I think they charge a premium.
It depends where you live. There's no one company that's established in all European countries. Every country has a shop similar to Amazon (often with fewer sponsored products and less drop-shipping garbage). There are also a few specialized shops (for books, sports, electronics...). Since 2020 I only buy from Amazon if they're significantly cheaper than other sellers. That's about 10% of my purchases.
I think you're failing to get that using a filesystem API to work with things that aren't naturally anything like filesystems might get perverse. And standard filesystems are a pretty unnatural way to lay out information anyway, given that they force everything into a tree structure.
This is what I was trying to get at. A lot of the data I deal with is directed, cyclic graphs. Actually, I personally think most data sets we care about are actually directed graphs of some kind, but we've gotten so used to thinking of them as trees that we force the metaphor too far. I mean, file systems are an excellent example of a thing we actually want to be a graph but we've forced into being a tree. Because otherwise why would we have ever invented symlinks?
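The symlink point can be made concrete in a few lines. A sketch (made-up node names) of what happens when you present a graph with a shared node through a tree interface -- the shared node shows up once per path, which is exactly the duplication symlinks paper over in file systems:

```python
# A directed graph where "libc" is shared by two parents. Enumerating
# root-to-node paths (the "tree view") duplicates the shared node.
# (With an actual cycle, this enumeration would never terminate at all,
# which is why tree APIs over cyclic graphs need visited-set guards.)
graph = {
    "app": ["gui", "net"],
    "gui": ["libc"],
    "net": ["libc"],
    "libc": [],
}

def paths(node, prefix=""):
    """Yield every root-to-node path, i.e. the tree view of the graph."""
    path = prefix + "/" + node
    yield path
    for child in graph[node]:
        yield from paths(child, path)

tree_view = list(paths("app"))
print(tree_view)
# "libc" appears twice: once under /app/gui, once under /app/net.
```

One graph node, two tree entries -- and any tool that walks the "tree" now has to decide whether those entries are the same thing or two copies.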
There's a bunch of literature about accessing graphs through tree lenses. I'm not sure exactly what you're looking for.
SQL certainly forces you to look at graphs as trees. Do you have a specific interface you're trying to access? If you're trying to use a graph database, why mention APIs and SQL?
I just assumed they wanted to interface with existing json over http apis rather than write their own code. The sibling of my previous comment addresses that concern.
Operating systems are where device drivers live. It sounds awfully impractical to develop alternatives at this stage. I think OP is right.
I think OSes should just freeze all their features right now. Does anyone remember all the weird churn in the world of Linux, where (i) KDE changed from version 3 to 4, which broke everyone's KDE completely unnecessarily (ii) GNOME changed from version 2 to 3, which did the same (iii) Ubuntu Linux decided to change their desktop environment away from GNOME for no reason -- but then changed it back a few years later? When all was said and done, nothing substantive really got done.
So stop changing things at the OS level. Only make conservative changes which don't break the APIs and UIs. Time to feature-freeze, and work on the layers above. If the upper layers take over the work of the lower layers, then over time the lower layers can get silently replaced.
I have never had so much negative feedback and ad-hom attacks on HN as for that story, I think. :-D
Short version, the chronology goes like this:
2004: Ubuntu does the first more-or-less consumer-quality desktop Linux that is 100% free of charge. No paid version. It uses the current best of breed FOSS components and they choose GNOME 2, Mozilla, and OpenOffice.
By 2006 Ubuntu 6.06 "Dapper Drake" comes out, the first LTS. It is catching on a bit.
Fedora Core 6 and RHEL 4 are also getting established, and both use GNOME 2. Every major distro offers GNOME 2, even KDE-centric ones like SUSE. Paid distros like Mandriva and SUSE are starting to get into some trouble -- why pay when Ubuntu does the job?
Even Solaris uses GNOME 2.
2006-2007, MS is getting worried and starts talking about suing. It doesn't know who yet, so it just starts saying intentionally not-vague-at-all things like the Linux desktop infringes "about 235 patents".
This is visibly true if you are 35-40 years old: if you remember desktop GUI OSes before 1995, they were all over the place. Most had desktop drive icons. Most had a global menu bar at the top. This is because most copied MacOS. Windows was an ugly mess and only lunatics copied that. (Enter the Open Group with Motif.)
But then came Win95. Huge hit.
After 1995, every GUI gets a task bar, it gets buttons for apps, even window managers like Fvwm95 and soon after IceWM. QNX Neutrino looks like it. OS/2 Warp 4 looks like it. Everyone copies it.
Around the time NT 4 is out and Win98 is taking shape, both KDE and GNOME get going and copy the Win9x look and feel. Xfce dumps its CDE look and feel, goes FOSS, and becomes a Win95 copy.
MS had a case. Everyone had copied them. MS is not stupid and it's been sued lots of times. You betcha it patented everything and kept the receipts. The only problem it has is: who does it sue?
RH says no. GNOME 3 says "oh noes our industry-leading GUI is, er, yeah, stale, it's stagnant, it's not changing, so what we're gonna do is rip it up and start again! With no taskbar and no hierarchical start menu and no menu bars in windows and no OK and CANCEL buttons at the bottom" and all the other things that they can identify that are from Win9x.
GNOME is mainly sponsored by Red Hat.
Canonical tries to get involved; RH says fsck off. It can't use KDE, that's visibly a ripoff. Ditto Xfce, Enlightenment, etc. LXDE doesn't exist yet.
So it does its own thing based on the Netbook Launcher. If it daren't imitate Windows then what's the leading other candidate? This Mac OS X thing is taking off. It has borrowed some stuff from Windows like Cmd+Tab and Fast User Switching and stuff and got away with it. Let's do that, then.
SUSE just wearily says "OK, how much? Where do we sign?"
RISC OS had a recognizable task bar around 1987, so 2006-2007 is just long enough for any patent on that concept to definitely expire. This story doesn't make any sense. As for dialog boxes with buttons at the bottom and plenty of buttons inside apps, the Amiga had them in 1984.
Yes, the Icon Bar is prior art, but there are a few problems with that.
1. It directly inspired the NeXTstep Dock.
This is unprovable after so long, but the strong suspicion is that the Dock inspired Windows 4 "Chicago" (later Windows 95) -- MS definitely knew of NeXT, but probably never heard of Acorn.
So it's 2nd hand inspiration.
2. The Dock isn't a taskbar either.
3. What the prior art may be doesn't matter unless Acorn asserted it, which AFAIK it didn't, as it no longer existed by the time of the legal threats. Nobody else did either.
4. The product development of Win95 is well documented and you can see WIP versions, get them from the Internet Archive and run them, or just peruse screenshot galleries.
The odd thing is that the early development versions look less like the Dock or Icon Bar than later ones. It's not a direct copy: it's convergent evolution. If they'd copied, they would have got there a lot sooner, and it would be more similar than it is.
> so 2006-2007 is just long enough for any patent on that concept to definitely expire.
RISC OS as Arthur: 1987
NeXTstep 0.8 demo: 1988
Windows "Chicago" test builds: 1993, 5Y later, well inside a 20Y patent lifespan
Win95 release: 8Y later
KDE first release: 1998
GNOME first release: 1999
The chronology doesn't add up, IMHO.
> This story doesn't make any sense. As for dialog boxes with buttons at the bottom and plenty of buttons inside apps, the Amiga had them in 1984.
You're missing a different point here.
Buttons at the bottom date back to at least the Lisa.
The point is that GNOME 3 visibly and demonstrably was trying to avoid potential litigation by moving them to the CSD bar at the top. Just as in 1983 or so GEM made its menu bar drop-down instead of pull-down (menus open on mouseover, not on click) and in 1985 or so AmigaOS made them appear and open only on a right-click -- in attempts to avoid getting sued by Apple.
> The point is that GNOME 3 visibly and demonstrably was trying to avoid potential litigation by moving them to the CSD bar at the top.
Well, the buttons in the titlebar at the top are reminiscent of old Windows CE dialog boxes, so I guess they're not really original either! What both Unity and GNOME 3 look like to me is an honest attempt to lead in "convergence" with mobile touch-based solutions. They first came up in the netbook era, when making Linux run out-of-the-box on a market-leading small-screen, perhaps touch-based device was quite easy -- a kind of ease we're only now getting back to, in fact.