Hacker News: churnedgodotdev's comments

You're right that this person can't be taken too seriously, but the linked website for hiring low-cost workers in the Philippines appears to be legit.


I read that Donald Knuth does this. He uses a USB flash drive to transfer data to and from his main book-writing Linux workstation. Air gapping has the added bonus of mitigating the constant distraction and breakage risk of never-ending app updates and security patches. As someone from the computing era before even floppies and network cards were a thing, when fast data transfer between sites meant mailing a deck of punch cards or reels of tape, Knuth probably doesn't find it slow at all. He also essentially quit email on January 1, 1990, after using it since 1975, because it was such a distraction from getting stuff done.

Also, George R. R. Martin uses WordStar 4.0 on a DOS machine:

  https://www.cnet.com/culture/george-r-r-martin-writes-with-a-dos-word-processor/


Elon 1st Amendment comment:

  Twitter acting by itself to suppress free speech is not
  a 1st amendment violation, but acting under orders from
  the government to suppress free speech, with no judicial
  review, is
https://twitter.com/elonmusk/status/1598853708443357185


IANAL, but neither is Elon. His "free speech is not free reach" comment is nonsensical, for example, because deliberately limiting reach doesn't align with free speech principles. This could be another uninformed or poorly worded claim.

> acting under orders from the government to suppress free speech, with no judicial review, is

One way of interpreting that might be that Elon thinks former Twitter execs were guilty of collusion. Yet, unless there is some really nefarious smoking gun where both parties were grinning about screwing over the public, I believe the most likely guilty party, if such events are found to have occurred, would be the government, not former Twitter employees.

I'm not sure if that's what Elon meant to imply though.


Seems like Godot already does what Blender Apps does, only better:

- Godot is great at importing Blender models/scenes.

- Blender is GPL, Godot is MIT.

- Godot produces much lighter .exe files. Blender Apps are at least 200MB.

- Godot has web export, too.

Godot still has stability issues that can affect larger projects (presently over 7,000 open issues on GitHub[1]), but for smaller demos and art pieces Godot is usually pretty solid.

[1] https://github.com/godotengine/godot/issues


Blender Apps, as described here, seem to be for an entirely different class of use cases: basically stripped-down versions of Blender itself, rather than games, art, or whatever. Blender used to have a "game engine" that was more like what I understand Godot to be, but IIRC it was scrapped.


> Basically stripped-down versions of Blender itself

Okay, that makes sense. The video on the linked Blender Apps page shows a really simple app where one drags models around and then drags materials onto them to apply those materials. Something simple like that would work better in Godot. Blender's 3D modeler is far more powerful than Godot's, but it has a very steep learning curve, so distributing a simpler version of Blender's modeler could definitely make sense.


> "gen z" students expect the "system" to adapt to address their struggles

I hope Gen Z soldiers don't have this attitude.

[Mortar round blows off leg of Gen Z platoon grunt.]

"OMG! Mortar rounds are SO UNFAIR! Let's sign a change.org petition demanding mortar rounds only emit harmless pink smoke. If we end up with pink smoke in our foxhole, we can demand more time to dig a deeper, better-concealed foxhole."


They could also be the type of people who get chemical weapons, bioweapons, and newer weapons banned from international war, as other pink-smoke-loving people have done in the past.


One of the reasons chemical weapons were banned was that they were ineffective and had the unfortunate tendency to blow back the wrong way over the lines.

Bioweapons are also just as likely to kill your own side.


Noble thought, but unlikely. With the end of WWII and the formation of the UN, it was hoped that the development of newer and more terrifying weapons would stop, but that was a false and unrealistic hope. I'm of the Woodstock generation, and we thought the world was in for a much better age, but tragically that never eventuated; wars never stopped, and they're now escalating again in serious ways.

One hardly needs to study history to realize that it's mainly the study of wars, weaponry, and people killing each other. It teaches us that civilization has always cycled between a state of fragile peace, constantly a hair trigger away from ending, and actual war and descent into barbarism; and the longer the cycles of peace last, the more sensitive that hair trigger becomes.

Few would be happier than me to see this or the next generation succeed as you suggest, but on the evidence I'd reckon the odds of success are extremely slim: so slim, in fact, that I'd put substantial money on it, as success would mean a complete turnaround from all of human history. And as the unprovoked war in Ukraine has shown, we are not yet even on the starting block!

I'd also suggest that come any serious global conflict, treaties outlawing chemical weapons, bioweapons, etc. will be violated on a whim; the evidence is that this has already happened in comparatively small conflicts, not to mention Putin's sabre-rattling refusal to rule out the nuclear option (and he's done so in what is still just a local conflict, albeit a serious one). What would be worse, the nuclear option or the other big two?

(As I write this my conviction is being further strengthened: I've just watched back-to-back television footage of the latest bombing of an apartment building in Ukraine where people were killed, and of the killing of dozens of innocent children in Thailand by, of all people, a parent!)


ASML is the closest thing to the planet Dune in our world. Chips made with their machines extend life. Chips made with their machines expand consciousness. And he who controls the chips, controls the universe. The information universe, at least. I just wish for a messiah to arrive who leads the jihad against FAANG. The Kwisatz-RISC-V-Haderach. The Voice from the Open Source World. Fortunately, he need not fell Mark Zuckerberg in single combat with crysknife in order to lead our people to true freedom.


Ada solved these problems in 1983.

More recently, you can use Frama-C to constrain the allowable values of C types and formally verify correctness.

In Ada, since 1983, you have been able to, e.g., declare your own signed symmetric type, without the -128 wart of an 8-bit two's-complement byte, like so:

  type Sym_8 is new Integer range -127 .. 127;
Then this fails at compile time:

  My_Signed_Byte : Sym_8 := -128;
  
SPARK can prove that all execution paths through your program are free of such constraint violations. This means proven SPARK code can disable runtime checks and run faster than the safest Rust/Zig dev settings, which insert runtime checks for overflow/underflow.

In Frama-C, say you want a function that returns an absolute value. This function will fail to verify:

  /*@ ensures (x >= 0 ==> \result == x) &&
              (x < 0 ==> \result == -x);
      assigns \nothing;  */
  int abs (int x) {
      if (x >=0)
          return x;
      return -x;
  }
It fails to verify because x might be INT_MIN, and -INT_MIN overflows a two's-complement int. So this will verify:

  #include <limits.h>
  /*@ requires x > INT_MIN;
      ensures (x >= 0 ==> \result == x) &&
              (x < 0 ==> \result == -x);
      assigns \nothing;  */
  int abs (int x) {
      if (x >=0)
          return x;
      return -x;
  }


For the sake of completeness: Frama-C (+ the WP plugin) can verify that the program conforms to the first specification; however, it can't verify that the program contains no undefined behavior (which is why we write the second specification).


How does this work in presence of user supplied values?

It has to have a check somewhere, no?


Of course. The checks are performed where the values arrive. If a file or network socket provides a number N that is going to be used as an array index, then N must be proved to be within array bounds before being used, which means you are forced to confront the bug at compile time and decide how to signal bad values of N.


Ok, but you can't check _runtime_ values at _compile_ time. If you have user input/file system/network supply N (where 0<N<100) you still HAVE to check it somewhere. You still have to have a runtime check when interacting with outside world.


Of course, that's what I thought I said in my previous comment, but I guess I wasn't crystal clear. The formal verification stack basically tells you, "There's no guarantee 0<N<100 holds if you get N from an I/O stream. N can be anything." So it's your job to insert a runtime check for N<=0 or N>=100 and decide how best to handle that, e.g. panic and quit, clamp N to a valid range if that makes sense, or avoid accessing the array if that makes sense.

But the system won't let you use a possibly out-of-range N as an array subscript, so you are forced to code a solution to this special case "at compile time", typically by inserting a runtime check. Though if you are reading data from a trusted ROM, you can also use a pragma or special assert to say, "This data can be trusted to have this invariant," and avoid the runtime check.


The cursor lag is really bad, particularly on Windows, but there are at least 2 ways the lag could be much lower:

1. Use 'desynchronized', e.g. canvas.getContext("2d", { desynchronized: true });

2. Instead of using requestAnimationFrame(), draw as soon as a move event is received.

Here's a paint app that uses techniques 1 and 2 to achieve much lower input lag drawing the pen strokes that chase the cursor:

https://paint.js.org/

Two other tricks to be aware of:

3. [Chromium only.] Use 'pointerrawmove' to get move events as soon as they happen, instead of waiting for a batch of coalesced moves to arrive after a delay.

4. Drawing where the cursor is predicted to be, to compensate for lag, works well during smooth cursor/pen motion. For example, the Windows Snip & Sketch tool actually renders the line you are drawing ahead of the detected pen tip, in the direction it is traveling. With, say, a Microsoft Surface Pen on a physical Surface Pro screen, the line being drawn would otherwise noticeably lag behind the physical pen tip (especially on a 60Hz screen) unless this predictive trick is used.

Finally, be aware of open Chromium issue #992954: when the devtools are open [F12], coalesced events aren't used, so you get lower input lag but more events to pump (like 'pointerrawmove'). This bug is maddening because behavior depends on whether or not you have the debugger open!

https://bugs.chromium.org/p/chromium/issues/detail?id=992954


No lag on my 6 year old iPhone. (I wasn't even expecting the site to work on it!)

Thanks for the tips. I make interactive art and creative tools, so these will come in very handy.

On a related note (input lag), I noticed in my high-speed recordings that even programs with no input delay (mspaint) lag one frame behind the mouse cursor, presumably something to do with double buffering (and the cursor being rendered separately by the GPU?). Is there any way to get around that on Windows, e.g. in fullscreen?

I.e. instead of predicting where the cursor will be to compensate for the fact that everything is rendered 1 frame late, can we just make it render on time?


> can we just make it render on time?

Oh, time to go down a low-latency rabbit hole. Start here:

https://raphlinus.github.io/ui/graphics/2020/09/13/composito...

Some key mentions: Windows has a hardware overlay API. A hardware overlay is why the OS mouse cursor renders with low latency even with 'vsync on'.

For fullscreen, the ultimate in tear-free low latency is beam racing. Search for 'blur busters beam racing' to go down that rabbit hole.


Actually, one reason (of many) I switched from Apple to Windows was Apple's well-known audio drop-out fiasco, caused by Apple's T2 security chip essentially hijacking data lanes when low-latency audio processing needed them. Here is an image of an audio capture of the T2 causing a hiccup:

https://tidbits.com/uploads/2019/04/T2-hiccup.jpg

This image was linked from this article:

https://tidbits.com/2019/04/05/what-does-the-t2-chip-mean-fo...


Since many HN'ers live in the greater Seattle area, it's important to be aware of the Tacoma Smelter Plume:

https://ecology.wa.gov/Spills-Cleanup/Contamination-cleanup/...

For almost 100 years, the Asarco Company operated a copper smelter in Tacoma. Air pollution from the smelter settled on the surface soil of more than 1,000 square miles of the Puget Sound basin. Arsenic, lead, and other heavy metals are still in the soil as a result of this pollution.


And for anyone on the east side of Toronto: there used to be a lead smelter there that polluted the whole neighborhood - http://laurajones.ca/wp/?p=515

