Author here - the reason is pretty underwhelming: these are the parts I had on hand when I decided to start the project. Using a chip with a built-in Ethernet peripheral would definitely make more sense (though I'd be trading the complexity of configuring the W5100 for the complexity of configuring ST's peripheral). The networking code already abstracts the actual chip into a driver interface (think read/write/ioctl), so porting the code would be pretty straightforward.
I'll look into the STM32F407 for the main series. Thanks
> though I'd be trading the complexity of configuring the W5100 for the complexity of configuring ST's peripheral
It's not too bad as far as such things go. The documentation on how the DMA system works leaves something to be desired, but it's not bad (and it's a heck of a lot faster than spitting packets over SPI).
If you can only capture 60 fps, you cannot accurately measure a 40 Hz signal, since 40 Hz is above the Nyquist frequency (half the 60 Hz sampling rate). (I know you mentioned the slow-mo camera, which probably does have the bandwidth.)
Frequencies beyond the Nyquist limit alias back into the resolvable band. So a combination of the camera plus a human eye to check that the flashing is “fast” (i.e. above the Nyquist limit) should be enough to verify the frequency.
You technically can measure over the Nyquist frequency: for example, if you measure 80 Hz here with your 60 fps camera, you can be sure that either you really did the math wrong with the intended 40 Hz, or it is actually the Nyquist shadow frequency of your 40 Hz oscillation!
Of course, you’re right about the Nyquist frequency (how could I forget the Nyquist-Shannon sampling theorem!). But I looked it up, and the iPhone 15 slow motion mode can do up to 240 fps, which is enough to easily measure it super accurately with a short sample.
This is a very first-world opinion. Low-end (second-hand) smartphones are some of the only computing devices available to a large part of the world.
> There's something more constricting about there being one function to bootstrap everything than there is about one file.
The trickiest thing is that main() is not even the bootstrap function. The actual entry point of a program is usually provided by libc, and is generally called _start (though it can be anything).
> ...remote biometric identification systems in publicly accessible spaces, biometric categorisation systems (e.g. categorizing by gender, race, ethnicity, citizenship status, religion, political orientation) and the use of AI for predictive policing.
> AI systems which can influence voters in political campaigns and by use of suggestion systems on very large platforms...
> New transparency and risk assessment requirements for providers of (generative) foundation models like GPT.
> Clarified exemptions for research.
Putting these kinds of restrictions in place is absolutely a good thing. While they might not get everything right, this is a step in the right direction. Our laws and understanding as a society have been lagging behind technological development for decades now. That fact has enabled a large amount of exploitation to take place, which has (in the last decade especially) had a large hand in massively undermining our democracies.
> New transparency and risk assessment requirements for providers of (generative) foundation models like GPT
This is absurd. For a relatively small sum in the grand scheme of things, I could rent a few A100s, download a free dataset, and train a model like LLaMA 30B, which is comparable to GPT-3 (and indeed such efforts are already popping up). Such a law could potentially make it illegal to upload such a thing if you live in the EU without going through a potentially expensive and bureaucratic process. It will stifle AI development the same way requiring people to go through a pile of paperwork to upload a new library would stifle web development.
This is the most use that board has gotten, or would have gotten, in the last 10+ years. I like to think that laying down its FTDI chip for a greater cause was an honour for it.
Indeed - I know I could have also popped the DIP micro out of the board and used it directly, but I chose not to for two reasons:
1. It's so much bulkier, and I wanted something with the same form factor as my old unit
2. It seemed way more fun to do things this way, and I was pretty sure I could do the whole operation in less than an hour
Clearly you should have compromised and hacksawed out the USB jack + FTDI region of the board, thus satisfying both your desire for something janky and for something compact. You would even get free activity LEDs!
Totally fair point. One of the videos I'm going to get around to making on my channel at some stage is a UART transceiver on an FPGA. Definitely not the "easier in a pinch" I was originally going for, but hey, if you've got the hardware hanging around!
I didn't mean for that to come across as a criticism; I was disappointed because I was hoping to learn something new, but I'm not the target for that post - I have a degree in Electronic Engineering and a couple of decades of tinkering.
I'd be interested in some FPGA content; I built my dissertation project on an FPGA and would love to see some modern, affordable FPGA boards/circuits with useful I/O to actually build something like this. I'm sure they exist; I've just never looked.
Thanks for the interesting read and I'll keep an eye on your repo!