I remember that PIC project. I don't know if source was ever released, but I recall a lot of folks being very dubious about the claims made.
Quote:
The PIC has 1024 words (12 bits each) of program ROM;
~256 bytes contain a hand-crafted RFC 1122-compliant implementation of TCP/IP, including:
HTTP/1.0 and I2C EEPROM filesystem, using 3 to 99 instructions.
TCP and UDP protocol stack, using 70 to 99 instructions.
ICMP [supports up to ping -s 11], using up to 14 instructions.
IP - Internet Protocol, v4, using 68 to 77 instructions.
SLIP - Serial Line IP packetisation, about 76 instructions.
Fully buffered UART, up to 115200 bps, using 38 to 56 instructions.
Operating system: RTOS with Rate Monotonic Analysis, using 3 to 15 instructions.
The problem with shebangs that I really want to solve is that I often need to edit files on a Windows machine, and then try to run the resulting CR/LF scripts on a Linux machine via a shared filesystem (Docker, WSL, Vagrant etc...).
I'd like to invoke them with just (e.g.) ./dostuff.py - Python, Ruby etc... have absolutely no problem running files containing Win-style line-endings. The only issue is that /usr/bin/env complains that it can't find (e.g.) "python3\r".
Yes, I know I could convert these files with dos2unix, and I also know that I can just invoke the interpreter explicitly - but I'm lazy, and this seems like such a trivial thing to solve that I can't believe it's not been done already.
I've taken to actually recompiling /usr/bin/env to strip trailing whitespace from the executable name - but there must be a better solution.
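For reference, the conversion itself is trivial - a minimal dos2unix substitute, just a sketch (the filename is the example from above):

```python
from pathlib import Path

def strip_crlf(path):
    # Rewrite CRLF line endings as LF in place, so /usr/bin/env sees
    # "python3" rather than "python3\r" in the shebang line.
    p = Path(path)
    p.write_bytes(p.read_bytes().replace(b"\r\n", b"\n"))

# e.g. strip_crlf("dostuff.py")
```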
There are editors that allow preserving the line endings (be they LF, CRLF or CR) from the source file. That could be a solution since you already seem to have control over your environments.
From personal experience, I've always had better results from Simulated Annealing than Genetic Programming. I hasten to point out that it's entirely possible I've been doing GP poorly - but part of the problem seems to me that GP has so many more knobs/dials/parameters that need to be tweaked correctly in order to yield good results.
Are you referring to GA or GP? GP is not comparable to simulated annealing, as it's kind of non-parametric.
Many ideas from GP are being reused in Bayesian program induction, in conjunction with differentiable programming, SAT solvers, etc. [1] IMHO a very promising route to AGI.
These are orthogonal things. Simulated annealing is an algorithm. Genetic Programming is a pursuit, which can (among other things) use simulated annealing to achieve its goal. Are you sure you're not confusing this with something else? Did you mean Koza-style GP perhaps? Or did you mean a genetic algorithm (GA)?
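For concreteness, simulated annealing itself is only a few lines. A minimal sketch (the neighbour move, cooling schedule and bounds here are toy choices, not tuned for anything):

```python
import math
import random

def anneal(f, x0, steps=10000, t0=1.0, cooling=0.999):
    """Minimise f over the integers 0..99 by simulated annealing."""
    x, t = x0, t0
    fx = f(x)
    best, best_fx = x, fx
    for _ in range(steps):
        # Propose a neighbour: step one unit left or right, clamped.
        nx = max(0, min(99, x + random.choice((-1, 1))))
        nfx = f(nx)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-delta / temperature).
        if nfx <= fx or random.random() < math.exp((fx - nfx) / t):
            x, fx = nx, nfx
            if fx < best_fx:
                best, best_fx = x, fx
        t *= cooling  # geometric cooling
    return best, best_fx
```

The point of the comparison upthread: everything here is a handful of scalar parameters (temperature, cooling rate, move size), whereas GP additionally asks you to choose representations, operators, population sizes, and so on.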
Because I wanted a strongly typed language that, among many good things, provides the opportunity for better error messages. Also, the AngelScript compiler is 100% embedded in the application and does not rely on an external environment the way Python does. This makes AngelCAD very easy to install and use.
I know Python is popular, and there is nothing to stop you from developing an alternative front end in Python or any other language of your liking while still using the AngelCAD boolean engine - just implement the xcsg protocol: https://github.com/arnholm/xcsg/wiki
A better question would be, why yet another C knockoff on top of XML bloat? It just seems so uninspired. 3D drawing ought to be a slam-dunk for a dedicated DSL: pure functional composition (objects described in terms of relationships with each other), homoiconic (so code can generate code directly; no need for XML middleware makework).
Plus, pure FP code should be time-independent in its evaluation, so it doesn’t take a genius to imagine the benefits of (safely!) introducing a time axis on top of that. Procedural generation, animation; great potential there.
This is not all idle speculation, BTW: I wrote a semi-declarative DSL for 2D artwork production that blew its traditional imperative rivals out of the water while also being easy enough for non-programmers to use. So I know there’s potential there, and I’d be fascinated to see what kind of language someone with a mind for 3D math could shape for the task. Plus as a trained sculptor turned automation junkie who can’t stand GUI-based 3D drawing apps (they just don’t fit my head), I’d love to try something with a language-based interface instead.
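To make the "pure functional composition" idea concrete, here's a hypothetical sketch - nothing to do with AngelCAD's actual API, and the representation (solids as point-membership predicates) is just one convenient choice for illustration:

```python
from typing import Callable

# A solid is modelled as a pure predicate: does it contain a point?
Solid = Callable[[float, float, float], bool]

def sphere(r: float) -> Solid:
    return lambda x, y, z: x * x + y * y + z * z <= r * r

def box(w: float, h: float, d: float) -> Solid:
    return lambda x, y, z: abs(x) <= w / 2 and abs(y) <= h / 2 and abs(z) <= d / 2

def union(a: Solid, b: Solid) -> Solid:
    return lambda x, y, z: a(x, y, z) or b(x, y, z)

def difference(a: Solid, b: Solid) -> Solid:
    return lambda x, y, z: a(x, y, z) and not b(x, y, z)

def translate(s: Solid, dx: float, dy: float, dz: float) -> Solid:
    # Moving a solid = evaluating it in shifted coordinates.
    return lambda x, y, z: s(x - dx, y - dy, z - dz)

# A 2x2x2 cube with a spherical bite taken out of one corner -
# objects described purely in terms of their relationships.
part = difference(box(2, 2, 2), translate(sphere(1), 1, 1, 1))
```

Everything composes referentially-transparently, which is exactly what makes bolting a time axis on top (parameters as functions of t) plausible.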
My team and I run the servers for a number of very big videogames. For a high-CPU workload, if you look around at static on-prem hosting and actually do some real performance benchmarking, you will find that cloud machines - though convenient - generally cost at least 2x as much per unit performance. Not only that, but cloud will absolutely gouge you on egress bandwidth - leading to a cost multiplier that's closer to 4x, depending on the balance between compute and outbound bandwidth.
That's not to say we don't use the cloud - in fact we use it extensively.
Since you have to pay for static capacity 24/7 - even when your regional players are asleep and the machines are idle - there are some gains to be had by using the right blend of static/elastic: don't plan to cover peaks with 100% static, and spin up the elastic machines when your static capacity is fully consumed. The same holds for anything that results in more usage - a busy weekend, an in-game event, a new piece of downloadable content, etc... It's also a great way to deal with not knowing exactly how many players are going to show up on day 1.
Regarding latency, we have machines in many smaller datacenters around the world. We can generally get players far closer to one of our machines than to AWS/GCP/Azure, resulting in better in-game ping, which is super important to us. This will change over time as more and more cloud DCs spring up, but for now we're pretty happy with the blend.
I discovered that my son (17 months old at the time) loves to mess with stereo controls. So I bought a few rotary encoders and neo-pixel rings, built a wooden enclosure with a plastic faceplate, and wrote some code to generate fancy light and audio effects when he turns/clicks the knobs. He loved it. We call it the "Max Distractor".
I recently needed to encode a 32-bit value into something easy for QA folks to remember and report. I opted for 3 words out of an 11-bit (2048 entry) dictionary of commonly used words.
How to build the dictionary? Well, in order to determine the most commonly used English words, I downloaded a bunch of free texts from Project Gutenberg, and did some simple filtering - nothing less than 5 letters, no duplication of singular + plural, etc...
A valuable lesson that I learned during this process is that when your corpus includes older English texts, you should always give your final list a visual once-over and apply some judicious manual filtering. I'm looking at you, "The Adventures of Tom Sawyer". (And, to a lesser extent, Moby Dick).
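For reference, the split itself is straightforward. A sketch of the scheme described above - the word list here is just a numbered stand-in for a real dictionary (note 3 × 11 = 33 bits, so one bit goes spare):

```python
def encode(value, words):
    """Encode a 32-bit value as three words from a 2048-entry list."""
    assert 0 <= value < 2**32 and len(words) == 2048
    # Take three 11-bit slices, most significant first.
    return [words[(value >> shift) & 0x7FF] for shift in (22, 11, 0)]

def decode(triple, words):
    """Recover the 32-bit value from its three-word encoding."""
    index = {w: i for i, w in enumerate(words)}
    v = 0
    for w in triple:
        v = (v << 11) | index[w]
    return v
```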
Or use the BIP39 lists, since they also contain 2048 words - and if you use BIP39 proper, you also get a checksum. RFC 1751[1] is the "standardised" option, but IMHO the wordlist it uses is far too easy to misread (though that's because the words are all 4 characters or fewer).
I have a Sonos that came with my house. It's actually the least of my worries.
The house was built about 10 years ago with what (I presume) was a state-of-the-art system at the time - an "AudioAccess WHEN" system. It works fine - there are keypads and speakers in every room, and I can pipe audio from the Sonos (or an Airplay receiver) to anywhere.
It's a weird topology, however - the speakers in each room are wired to the keypads (which is where the amps live). Each keypad has a power connection, and some kind of (presumably proprietary) Cat-5 connection to a central hub. The hub in turn is connected via Cat-5 to a head unit with FM receiver, CD/AUX inputs, etc...
When we moved into the house, the head unit wasn't working - it refused to establish a connection to the hub. I managed to track down a working tech support phone number, only to hear that they don't make this system any more, and that the head units often fail in this way. I managed to find what may have been the last replacement head unit in existence on eBay - bought it, and fortunately everything started working!
I am, however, dreading the day when it inevitably dies. Since the speaker wires go to the keypad amps, and not to the wiring closet (where the hubs live), I'm not sure what I could replace it with - beyond re-running new speaker wire to a completely new system in the wiring closet.
If there's cat5 from the keypads back to the central hub, and the speaker wires go to the keypads, then you can just hook the speaker wires to the cat5 - choose one or two pairs (depending on the distance, you might want to "double up" pairs to lower the resistance over the run), and hook 'em up. Then you just need to figure out a distribution system at the central hub location.
If you find you can run the speakers on a single pair per speaker - that will leave you with 2 other pairs on the cat5 - which you could use for control or communication.
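A rough sanity check on the "double up" point, assuming 24 AWG solid copper at roughly 0.084 Ω per metre per conductor (an assumed figure - check your actual cable) and a nominal 8 Ω speaker:

```python
def loop_resistance(run_m, pairs=1, ohm_per_m=0.084):
    # Current flows out and back, so double the one-way resistance;
    # pairs in parallel divide it.
    return 2 * run_m * ohm_per_m / pairs

def power_fraction_lost(loop_r, speaker_ohms=8.0):
    # Fraction of amplifier power dissipated in the wire rather than
    # the speaker (simple series divider, ignoring reactance).
    return loop_r / (loop_r + speaker_ohms)
```

At a 15 m run, one pair gives about 2.5 Ω of loop resistance - roughly a quarter of the power lost in the wire - while doubling up to two pairs halves that.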
But what I would do before all of that is try to reverse-engineer the protocols or whatnot that the whole system is currently using, so you can keep the keypads/amps and such, and create some kind of custom main "head unit" later.
This one is fascinating - especially the slides from the JavaOne presentation. It JITs heavily-used R3000 blocks into Java bytecode, which the JVM then JIT-compiles in turn. And he did it 14 years ago!