Brilliant comment. Coming at this as a typographer/graphic artist and erstwhile coder, I'd bet it comes down to reverse-engineering anti-aliasing algorithms. I'm not sure how it's done in Windows, but on Macs there are various crispness levels you can set for how default type is rasterized, and if you zoom in a bit they have very clearly recognizable differences. Take the four bottom-left pixels of a capital A at 300 ppi, compare their ink-value ratios across different anti-aliasing techniques, and I bet you could get a signature of which card did the rasterization.
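To make that concrete, here's a toy sketch (numpy, with entirely made-up pixel values) of what an "ink-value ratio" signature could look like. The two patches stand in for the same four bottom-left pixels of a capital A as rendered by two hypothetical rasterizers; nothing here is a real renderer's output.

```python
import numpy as np

# Hypothetical 2x2 patches from the bottom-left of a capital A, as
# rasterized by two different anti-aliasing implementations. Values are
# ink coverage in [0, 1]; only the partially-covered edge pixels differ.
patch_renderer_a = np.array([[0.00, 0.35],
                             [0.60, 1.00]])
patch_renderer_b = np.array([[0.00, 0.42],
                             [0.55, 1.00]])

def aa_signature(patch):
    """Each pixel's ink value as a fraction of the patch's total ink.
    Ratios are insensitive to overall darkness (toner level, exposure)
    but preserve the relative weighting the anti-aliasing filter produced."""
    return (patch / patch.sum()).ravel()

sig_a = aa_signature(patch_renderer_a)
sig_b = aa_signature(patch_renderer_b)

# Distance between signatures: near zero for the same rasterizer,
# larger across different anti-aliasing algorithms.
distance = np.linalg.norm(sig_a - sig_b)
```

In practice you'd build such signatures from scans of known printer/OS combinations and match against the questioned document.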
Gaussian blur is your friend if you wanna send a death note, I guess.
I don’t think printer drivers do anti-aliasing on text. The hardware of a printer does anti-aliasing for free.
Also, I doubt you can get conclusive evidence from a single letter. Luckily, your average ransom note has a lot of them, even duplicated ones. I would carefully align and average out as many capital A's as I had, and work with that.
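The align-and-average idea can be sketched in a few lines. This is a toy version (numpy, a crude synthetic "glyph" instead of a real scanned A, brute-force integer-shift alignment instead of proper subpixel registration): noise in the individual scans averages out while the consistent rasterization pattern survives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a "true" 8x8 glyph, plus several scanned copies
# that are shifted by a pixel and corrupted by scanner noise.
glyph = np.zeros((8, 8))
glyph[2:7, 3:5] = 1.0            # a crude vertical stroke stands in for the A

def noisy_copy(shift):
    copy = np.roll(glyph, shift, axis=(0, 1))
    return np.clip(copy + rng.normal(0, 0.1, glyph.shape), 0, 1)

copies = [noisy_copy(s) for s in [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1)]]

def best_shift(ref, img, max_shift=2):
    """Brute-force the integer (dy, dx) that best aligns img to ref,
    scored by plain cross-correlation."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = np.sum(ref * np.roll(img, (dy, dx), axis=(0, 1)))
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

ref = copies[0]
aligned = [np.roll(img, best_shift(ref, img), axis=(0, 1)) for img in copies]
average = np.mean(aligned, axis=0)  # noise averages out, the pattern survives
```

With real scans you'd want subpixel alignment (e.g. phase correlation) before averaging, but the principle is the same.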
> I don’t think printer drivers do anti-aliasing on text. The hardware of a printer does anti-aliasing for free.
You missed the explanation above of why you're wrong, at least for consumer non-PostScript printers. Most cheap printers nowadays pass the buck of rasterisation to Windows (and its horrible, security-headache spooler). You can even check which is which: in Windows 10, open Settings, then Devices, select Printers & scanners, select your printer, press Manage, press Printer Options (not Printing options), open the Advanced tab and then click on the Print Processor... button. If it says "winprint" then Windows handles the rasterisation.
Anti-aliasing text is of limited value at the resolutions printers can achieve. 1200+ DPI inkjets and lasers have been commonplace for over 20 years. That doesn't mean GDI variations won't influence pixels due to small numeric differences.
In that case, it'll be purely a mechanical thing. Another thing that's still handled by the printer (unless its drivers are sophisticated; winprint isn't) is halftoning, but I'm not sure if that counts as anti-aliasing.
IDK. In the olden days, desktop printers sometimes had embedded font faces, or PostScript Type 1 fonts would be sent to the printer, but any vector file for large/high-res print quality had to be "ripped" (rasterized) first, usually with a dedicated card. These cards definitely had signature looks and feels to them, but so did the fonts. There were differences between the way an Adobe Times New Roman would rip versus the one that came stock on your Apple IIsi.
Pinpointing a version of Windows, if it was printed from a stock OS font, could be as simple as comparing tiny differences in the vector files and knowing if one pixel would rasterize at 60% black versus 50%. To the extent that the rip goes through a graphics card, it would be knowing whether that card rendered the 60% as 58% or 62%.
I'm pretty sure that if you scale it down, the printer driver will do an extra layer of downsampling and add its own anti-aliasing. The printer hardware doesn't do that; it just sprays the dots it's told to spray. In general the drivers replicate the pixels sent from Photoshop (or in this case MS Word), which uses an embedded system process to rasterize the fonts, much like QuickDraw used to on the Mac.
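The "downsampling adds its own anti-aliasing" point is easy to demonstrate with a toy example: start from a 1-bit edge with no anti-aliasing at all, box-filter it down by 2x (the simplest downsampler a driver might use, chosen here for illustration), and fractional gray values appear that weren't in the input.

```python
import numpy as np

# A hypothetical 1-bit (no anti-aliasing) diagonal edge at source resolution.
hi_res = np.array([[1, 1, 1, 0],
                   [1, 1, 0, 0],
                   [1, 0, 0, 0],
                   [0, 0, 0, 0]], dtype=float)

def downsample_2x2(img):
    """Area-average each 2x2 block -- a plain box-filter downsampler."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

lo_res = downsample_2x2(hi_res)
# lo_res now contains fractional coverage values (0.25, etc.) that were not
# present in the 1-bit input: the downsampler has anti-aliased the edge.
```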
Depends on the tolerance of what you're trying to hide. If the goal is just to obliterate the way something was previously anti-aliased, or make it trigger tons of false-positives, then a small blur and not relying on the inbuilt rasterization would probably do the trick.
Prior to this it had never occurred to me. But yeah, more randomness. Noise filter and blur, then a bit more noise, then photograph it, print the photo, scan it on another device, put it in the washing machine, leave it on the porch for a week and repeat.
Mostly... It's a low-pass filter, so if the information (the anti-aliasing technique, in this case) is concentrated in high frequencies, it'll be wiped out.
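You can see the low-pass behaviour numerically with a 1-D toy: a signal with a low-frequency component (the letterforms) plus a high-frequency component (standing in for fine anti-aliasing edge detail), convolved with a Gaussian kernel. The blur passes the low band almost untouched and crushes the high band. The frequencies and sigma are arbitrary choices for illustration.

```python
import numpy as np

n = 256
t = np.arange(n)
low = np.sin(2 * np.pi * 2 * t / n)    # low-frequency content (letterforms)
high = np.sin(2 * np.pi * 60 * t / n)  # high-frequency content (edge detail)
signal = low + high

# Gaussian kernel with sigma wider than the high-frequency period.
sigma = 4.0
k = np.arange(-12, 13)
kernel = np.exp(-k**2 / (2 * sigma**2))
kernel /= kernel.sum()

blurred = np.convolve(signal, kernel, mode="same")

def band_amplitude(x, freq_bin):
    """Magnitude of one DFT bin, as a crude band-energy probe."""
    return np.abs(np.fft.rfft(x)[freq_bin])

# Fraction of each band's amplitude that survives the blur.
low_keep = band_amplitude(blurred, 2) / band_amplitude(signal, 2)
high_keep = band_amplitude(blurred, 60) / band_amplitude(signal, 60)
```

So whether the blur "wipes" the signature depends on the kernel width relative to the scale of the anti-aliasing detail, which is the tolerance question raised above.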