I listened to an interview with the original ClearType author on a podcast almost a decade ago now.
IIRC he said that when Apple got the ClearType patent (via the Apple/Microsoft cross-licensing agreement in the late 90s), they kind of messed up the implementation, so Windows actually implements ClearType correctly and Mac OS X (now macOS) doesn't.
I think this is something that every university graduate needs to understand so as to not be bitter.
The first years of your career will be shit, there is just no beating around the bush about this fact. You will be treated like a kid, you will be paid like shit (borderline minimum wage a lot of the time), and you will have to work ridiculous amounts to be seen on the same level as other people in the field.
Once you break the 1-2 year experience mark, you can begin to expect entry-level benefits for entry-level positions. After you break the 5 year mark, the world is really your oyster. With 10 years or more you can demand blank checks.
Have you considered, though, that if more money goes into solar, wind, etc. (via BTC/ETH mining and similar), those companies have more to spend improving their existing offerings or researching new tech?
A lot of the time, early adopters end up essentially funding technology that wouldn't otherwise have investment.
With Bitcoin mining, that money comes with the massive carbon emissions driven by the huge energy demand. The alternative, governments subsidizing solar/wind/etc. deployments, comes without the massive energy demand and lets grid operators shut down their carbon-emitting plants.
Like I said, the fact that people are bringing carbon-emitting power plants online to power Bitcoin rigs speaks volumes about this line of thinking.
The problem with many people with your attitude is that everything must be done perfectly now. This is the attitude of an abolitionist: it isn't perfect, therefore it must be banned.
There is money to be made mining Bitcoin and people will mine it (illegally if need be). It is better if this funds renewables now (according to the clip, they claim to be paying for offsets and investing in solar), even if it isn't perfect (the plant itself seems better than the coal-fired plants in China).
This is the broken window fallacy that is often repeated on reddit and HN. It makes about as much sense as all of us buying solar panels and leaving our refrigerators open all day.
I didn't know what the broken window fallacy was, so I looked it up. It is rather dubious whether what I said fits that fallacy.
If I leave my refrigerator open all day, it is guaranteed to produce nothing of value. In this case it is more dubious: they are producing a good that people want to buy. You might think Bitcoin is worthless, but that is beside the point; other people think it is worth it. Since the benefit versus the energy usage is completely subjective, it isn't as clear cut as you like to make out.
My point was that people spending time doing things that seem pointless at the time can sometimes benefit everyone in the future in unexpected ways. Neither you nor I can know what the future holds.
>If I leave my refrigerator open all day, it is guaranteed it will produce nothing of value.
This isn't true. I can put the back of my refrigerator in my doorway and use it to cool my house. "Leaving my refrigerator open all day isn't wasteful, because it cools my house and I use green energy to do it." This is essentially the logic used in this thread; it is a complete non sequitur.
I suppose one could say that whether or not it's wasteful to cool my home with my refrigerator is subjective, but most would say that's just being obtuse.
> OP's statement leads straight to "they're out to get me," which is absolutely not helpful, and it's absolutely an indicator of a difficult hire.
Not at all. If you've been out of work for a year, watching your savings (if you have any) dwindle, venting in an internet comment section doesn't indicate how you are in person at all.
> If I'm interviewing someone and the person is demonstrating a victim narrative, that is a huge red flag.
You are ignoring the present situation entirely. Governments have put everyone's life on hold for well over a year now (we are in month 17). Statements from authorities have been contradictory and nonsensical, they have lied in some cases, and some have broken their own lockdown rules (e.g. in the UK, Matt Hancock, who IIRC was in charge of public health, was exposed as having an affair during COVID, which BTW was illegal under the lockdown rules).
So many people can see it for what it is. One rule for them and one rule for the plebs.
> Similarly, the inability to see shades of gray (e.g. "using MySQL is always utterly stupid, you should never use it in favor of Postgres") which OP demonstrates with their COVID statements, makes a person virtually non-employable in my book
You are reading far too much into comments around COVID due to your personal bias (which, BTW, is obvious here). Disagreeing with someone on a particular issue doesn't indicate their thinking in a different field of expertise. E.g. there are many great scientists who believe deeply in religion, which, as a non-believer, I would think would be at odds with their work.
I have personally found it very difficult to find a job during COVID as well. I am almost 40 now and it worries me that I might experience the same in the future.
In addition to Matt Hancock, there was also the case of exemptions from quarantine for UEFA football VIPs. [1]
I think it's right that the UK is returning to personal responsibility, especially as many of the rules haven't seemed science-based, e.g. you must wear a mask entering a pub, but you are allowed to remove the mask while eating seated.
It's health-theatre, rather than virus prevention.
Many notable people who insisted we should all be locked down were exposed as breaking the rules themselves. It is quite frustrating when I live in an area where almost everyone followed the rules.
Yes, there is quite an element of theatre to the whole thing, which makes sensible discussion about the issue impossible. Which I believe is somewhat by design.
The basic takeaway is that you make it impossible for the average person to keep track of the thread of events, and they become apathetic to it. I've also watched Adam Curtis's "Hypernormalisation". It is well worth a watch.
The fact was that I wasn't right. They were doing decent work and their beliefs didn't affect their work.
A lot of non-believers (in the past I would have included myself in that list) seem to believe that because they don't believe in a higher being, they are somehow more "rational". Nothing could be further from the truth. It took me a long time to realise that I wasn't being more rational than the faithful, and that I was dogmatic about things that were simply quasi-religious. It took a lot of introspection, and several times I had moments where my ideology hit reality hard and I spent several days dealing with cognitive dissonance and having to accept I was just wrong.
Indeed, but I think that was the point. It's entirely possible to hold irrational beliefs in one area without them affecting your work in another area.
Whether you think it's at odds or not, in practice many great scientists have been devoutly religious. For example, Newton.
I was unemployed for 9 months in the UK. I was a contractor/consultant and had to take a full time position.
When lockdowns started I was literally finishing a contract and was happy to have some time off. After month 7 of not having any work, eating into savings, and constant lockdowns, I had to go full time.
The interview process was frustrating, to say the least. Lots of pre-screen "tests", some of which were two to three hours of computer-science-style questions that are irrelevant to web development.
It was a frustrating process that was exacerbated by COVID.
It's called progressive enhancement. You test whether you have a feature, and code around it if you don't.
e.g. when createObjectURL was a new thing, I ended up doing something using AJAX and some processing server-side for IE and Edge. Now, I could have done all the processing server-side. However, 80%+ of the users used Chrome, so instead of checking everything on the server, I could offload 80% of that work to the browser.
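A minimal sketch of that kind of feature detection (`createObjectURL` is the real browser API; the strategy names and the `chooseUploadStrategy` helper are made up for illustration):

```typescript
// Progressive enhancement via feature detection: if the browser exposes
// URL.createObjectURL, do the processing client-side; otherwise fall back
// to posting the data and processing it server-side.
type ProcessingStrategy = "client" | "server";

function chooseStrategy(urlApi: { createObjectURL?: unknown }): ProcessingStrategy {
  // Detect the feature rather than sniffing the browser name.
  return typeof urlApi.createObjectURL === "function" ? "client" : "server";
}

// In a real page you would pass the global URL object:
//   const strategy = chooseStrategy(URL);
```

The point of testing the feature itself (rather than the user agent string) is that the fallback keeps working as browsers gain or lose capabilities.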
"Coding around it" means handling an additional error state, which can be in varying degrees of complexity depending on what the feature is doing for your app.
Yes, obviously. It really depends on how much time you have, what browsers they care about supporting, etc.
There is tooling out there to assist you, and most of the compatibility issues are well documented now. 10 or so years ago, when I started, that really wasn't the case.
> We do see (3) with Typescript and Coffeescript. These create some debugging friction. Also as JavaScript gets marginally better the appeal of these other languages can feel more niche, and many people just use the common denominator.
I can't speak to CoffeeScript (I've read about it, but never used it). But at my current job I basically spend all day writing TypeScript.
1) There is no debugging friction: with source maps I can debug TypeScript directly in my browser.
2) It isn't marginally better than JavaScript. It is leaps and bounds better than JavaScript.
The TypeScript compiler will catch many, many common problems straight away (provided it is configured correctly). I have written a lot of JavaScript, and I don't enjoy going back to writing plain JavaScript.
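As a sketch of the kind of mistake the compiler catches at build time (the `User`/`greet` names here are invented for illustration, not from any particular codebase):

```typescript
// With strict checking enabled, misuse is rejected before the code runs;
// plain JavaScript would only fail (or silently misbehave) at runtime.
interface User {
  id: number;
  name: string;
}

function greet(user: User): string {
  return `Hello, ${user.name}`;
}

const msg = greet({ id: 1, name: "Ada" }); // OK

// These would be compile-time errors, not runtime surprises:
//   greet({ id: 1 });              // missing property 'name'
//   greet({ id: 1, nmae: "Ada" }); // typo caught by excess property checking
```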
The friction with typescript is the build environment (speaking as someone who also prefers it, and uses it all day).
If someone else is handling that work for you (and in most companies this does essentially become a core responsibility for a few critical maintainers) then yes - Typescript feels great.
If you have a small, single person project - there's enough real overhead there that I'm not sure typescript is the right initial choice.
> The friction with typescript is the build environment (speaking as someone who also prefers it, and uses it all day).
You can literally do:
    tsc filename.ts
And you will have a JavaScript file that you can run in a browser. No configuration files needed; you just need to install the TypeScript compiler. Sure, if you need more complicated things, you will need a more complicated setup. But it is no worse (and in some ways better) than most of the ecosystem.
> If someone else is handling that work for you (and in most companies this does essentially become a core responsibility for a few critical maintainers) then yes - Typescript feels great.
I don't know where you work. However, I've done this by myself. It is well documented and it isn't that difficult. I think the hardest to set up from scratch is webpack; Gulp is fairly straightforward.
I have set stuff up completely from scratch and created my own framework with TypeScript (I was bored during lockdown), and it takes maybe a day to figure out with webpack.
Not something for a beginner. However you don't need a whole team to do this or specific people looking after it.
> If you have a small, single person project - there's enough real overhead there that I'm not sure typescript is the right initial choice.
The same could be said of any other language where you need any sort of build system and a compiler.
I find it hard to believe that this was a spontaneous event in light of previous events.
It worries me greatly that speech could be compelled. In the UK a very similar case was brought against bakers here, and there was a decisive ruling in favour of the baker.
It's quite interesting because conservatives in the U.S. are simultaneously attempting to compel tech companies to host content against their will but at the same time they are trying to argue that these cake / flower / website makers don't need to make content against their will.
The difference is that the tech companies between them control a huge amount of online communications. The bakers, florists and small web-developers don't.
> The stupid RGB LEDs can be disabled in the BIOS/firmware menu by setting “AURA” to “Stealth Mode”, and I unplugged the two 1” fans on the I/O board to keep everything silent. I also had to set the “CPU Fan Speed” in the “Monitor” section to “Ignore”, or else it would indicate a fan error at every boot since I had none plugged in.
I can't tell, looking at the motherboard he linked, whether it has a fan there or which 1-inch fan he is talking about.
However, I have an ATX board that is very similar, and the little fan is there to cool the chipset and it *should* be running. The X570 chipset gets very hot from what I understand.
An X570 board is indeed not a good choice for a build like this, because the I/O options offered by the chipset are not actually used. This particular one seems to have one fan for the VRMs, which is likely not needed in his use, while the chipset will actually need its fan (with the stock cooler; if you replace the usually-not-very-good OEM designs, you can cool X570 passively).
A B450 or B550 board would have been a better idea, and much cheaper to boot. It would also significantly reduce power usage, as the X570 chipset alone needs something like 8-9 W more at idle than the B450 chipset.
Indeed, I bought a B550 board specifically because it doesn't require a chipset fan that's tiny, inefficient, and much better at making noise/vibration than it is at moving air. They tend to be rather unreliable as well.
There are newer X570S motherboards out now, if you need the highest-end boards and don't want the extra fan.
Thanks for pointing that out - I've been nosing around as I consider what I want to build next, and I hadn't realized the X570s tend to need chipset fans. That's a deal-killer for me. Either it's going to be noisy as all get-out, or it's going to be some cheap part that fails prematurely, assuming it isn't both. I definitely prefer big and slow for fans.
The new crop of X570 boards, labeled X570S or something, do not have a chipset fan. (Also on the original ones, the fans might've had full silent modes where they only spin up when pushing big PCIe bandwidth over the chipset?)
> IIRC he said that when Apple got the ClearType patent (via the Apple/Microsoft cross-licensing agreement in the late 90s), they kind of messed up the implementation, so Windows actually implements ClearType correctly and Mac OS X (now macOS) doesn't.
So I find this somewhat amusing.