A lot of landlords don't seem to understand that their rental properties are a business and an investment, and thus inherently carry risk. Fewer still comprehend that little landlords generally carry a lot more risk than the bigger property management companies. They just keep being shocked and offended when they seemingly discover this risk for the first time, all over again.
I think you're right; a lot of people underestimate just how risky the real estate market is. Back in the 90s it was seen as a low-risk endeavor: pretty much guaranteed money, with tons of cheap leverage to max out returns. After all, everyone needs houses, and the population just kept going up faster than houses could be built, especially with immigration.
However, the cheap leverage and the massive influx of foreign investment kinda fell off, which made prices drop like a rock in 2008. I wouldn't be surprised to see it happen again in late 2020 and 2021, when a bunch of people face balloon payments from deferred mortgages.
Only for personal use; businesses may still need to show licenses, and with telemetry it's easier for Microsoft to find such usage. That said, with the switch to Linux servers, Windows is probably a declining business for Microsoft while Azure and Office are growing, hence the “Microsoft 365” subscriptions.
I believe for non-production usage, Microsoft is okay with unlicensed or trial Virtual Machines and such. The idea being that if developers are testing their software in Windows, that’s better than the alternative (not testing their software in Windows).
Similarly, if you're willing to beta test versions of software, you can get Visual Studio for “Insiders” and test the Enterprise edition for free with a non-production license for as long as newer beta versions are made available. (Note: IANAL, this ain't legal advice; read the terms and conditions on any Microsoft VMs or software installations.)
There is a permanent Windows Activation message (white text on a transparent background) in the lower right part of the screen, which my brain has already blocked out, so it's not distracting at all. I only notice it after taking screenshots.
The problem is that the drug doesn't stay at an effective level in the body for 40 days (it also takes more than a month for the drug to take effect), and the consequences of allowing lupus to flare include accumulating permanent damage to, for example, the heart, lungs, kidneys, and joints. Additionally, there's no good information yet on how an immunocompromised patient in a flare will react to coronavirus infection, beyond the general notion from our rheumatologist that putting your malfunctioning immune system into a state of dysfunction and then getting the virus that puts your immune system into a state of dysfunction is probably really bad.
source: my partner has rheumatoid arthritis and takes hydroxychloroquine, and we've been concerned about exactly this sort of thing happening for a while now.
Can't speak for the OP, but my background is exactly the same, self-taught web dev, started in 2007, worked for myself for a decade (no college degree, either). I understand where he's coming from. I'm also now a senior front-end engineer at a fortune 500 company, so I figured out how to transcend as a programmer.
As a self-taught, self-employed coder, I can tell you that you can get extraordinarily far without knowing a single fundamental. Just having a willingness to dig in and tinker will put you miles ahead of what most people are willing and able to do, and you can absolutely build a career out of a can-do attitude and Google. Over time, however, what seems to end up happening is that many people in this situation (myself included) learn systems without learning technologies.
At my place of employ, we distinguish between Application Developers and Software Developers, and I think that's a thing a lot of large companies do. Both are valuable, but they're fundamentally different. You can have a very advanced Sitecore Application Developer who writes Sitecore code all day, but if you hand him an IDE and ask him to write up a CRUD app, he'll flounder, whereas a software engineer can more or less invent something new on the fly from scratch, but isn't going to be particularly fluent in any given third-party system.
I think, overall, these long-term self-taught web developers hit a wall like they do because they're really application devs and not software engineers. Sometimes it's actual apps (I'm looking at you, WordPress and Drupal), and sometimes it's libraries or frameworks like jQuery and Bootstrap. I certainly hit that wall. It's basically the same sort of fluency one often finds in bootcamp grads, just with much more practice.
I managed to get over the hump by basically starting over after 10 years: relearning the fundamentals of browser development, really mastering vanilla HTML, CSS, and JavaScript, and picking up the basics of Node, local automation, and just enough computer science to be dangerous. With that, I took those skills and started working for other people again, as a developer on a team. I can't tell you how valuable it's been to simply have other human developers to compare myself to. This is what I learned:
1: There are a lot more shitty developers with CS degrees out there than I realized. However mediocre I may be, there are loads of my peers out there being just as bad.
2: You'll advance faster than your peers once you enter industry. Junior developers can usually program, but senior devs use old age and treachery to get shit done. You've done the hard part already by gaining experience -- now you just have to catch up with technology.
3: You'll find the software engineers around you are generally more specialized than you are. People who work for themselves for a long time get very good at doing everything all the time and learning new things quickly because clients are needy and think you're good with computers. This is a superpower. This is what tech leads are made out of.
I mean, look, I couldn't re-implement quicksort if you put a gun to my head, I've never written my own compiler, and my networking fundamentals are complete shite. But I can negotiate for the resources my team needs, I can bash a misbehaving legacy codebase in a language I don't know into submission fast, I can sniff out an overengineered solution at 30 paces, I can architect a front end, I can delete code at a furious pace, and I can sit on juniors until they start making simple solutions instead of elegant ones.
I'm 40. I've been doing this for 14 years, and I'm exactly where I should be in my career. That's enough for me.
In one case I've observed, the effort was _honest_. There was a real attempt to give IT the same seat at the table as the rest of the business. That doesn't mean it was easy, and I wouldn't call it SMOOTH, but the transition itself wasn't that bad. Ironically, I think if they'd had just 1 or 2 different people in the right places during the transition, they would have done well enough for themselves that I'd wish I was still working there.
In another observed case, well, the process did seem to shine a light on people whose primary duties at work were to make sure they still had a job... In that scenario it can get really ugly, because the optics make it look like "We empowered IT and things got worse!" That, of course, ignores how the gatekeepers would rather quietly ruin projects than give another division a 'win'. It leads to a huge fear culture, because you start to ask who's going to be the next sacrifice for the lack of real progress on anything.
Hello! Now you've met a female dev with no CS degree. There may be a reason we're thin on the ground, too. My "training" as a software dev consisted of 10 years of self-employment as a freelance web developer, and when I decided to transition to working for a company, I faced massive hiring discrimination. I couldn't pay someone to interview me! And I did demonstrate that it was discrimination, too. My resume was getting something like a 2% response rate with my name on it, but I put a man's name on it and sent out a bunch, and suddenly I was getting responses to more than 80% of them.
Anecdote is not data, and it's only my individual anecdote, but in my experience, breaking through into the industry was incredibly hard. Since then I've made up for lost time and advanced faster than a college grad would expect (junior dev in a shitty agency to enterprise lead dev in about four years), and I attribute that to spending so incredibly long as essentially a junior dev freelancer and just being older.
So it can be done, certainly, but I strongly suspect there are several filters working against self-taught developer women making that transition into the industry, and one of them is definitely gender discrimination. And no, aside from sending out resumes with a man's name on them, I don't know how to fix it.
I suspect you'd have had far easier success with a BSCS on your resume. I can't imagine trying to get into this field without having had a degree -- it is a hard industry to get into.
I keep hearing about gender discrimination, and maybe this is a market segment thing, but everywhere I've worked is actively recruiting female devs. Also, the places I've worked as a dev typically require a CS degree, for what that's worth. Many of them even require an MSCS -- I've actually been turned down for one job because even though I have an MS, it wasn't an MSCS (gov contracts get very specific).
Oh you're almost certainly right, it would have been easier with the degree. I don't have any degree, and programming is a second career for me, so in a world where 26 year old senior developers are a thing, it's just one more thing putting me on my back foot. My biggest regret is not starting down this path 10 or 15 years earlier, honestly.
I am absolutely willing to believe it's a market segment thing. I live in a smaller city full of colleges, so the handful of large companies hiring want all of their jr devs to be right out of college, and that's straight from the recruiters' mouths. I was forced to apply to all the small companies.
And, for what it's worth, my current job and the two before it "required" a degree. From what I can tell, that requirement is to filter people on the low end of years of experience. Old age and treachery counts for a lot more than you may think.
Requirements can certainly be waived -- a good candidate who can "network in" can often bypass gates like that if they're a known quantity to someone in the company. Personal recommendations do make a huge difference -- I've networked my way to nearly every job I've gotten in some way.
The only evidence I have is response rates, and about 12 companies that responded to a man's-name resume that did not respond to an identical resume with a woman's name. I have consulted with a labor lawyer, who told me that this might be actionable if I can get enough women together to also perform this experiment and gather together in a class-action lawsuit. I don't live in a particularly large city, and I believe it's almost impossible without astroturfing.
> responded to a man's-name resume that did not respond to an identical resume with a woman's name
If you can demonstrate evidence of this, you absolutely should, because that would be a particularly surprising finding considering how many organizations state upfront that they have preferential hiring for women. I’ve hired software developers for decades, and the resumes are dominated by Indian-sounding names: I don’t know about you, but I usually can’t tell whether those names are masculine or feminine. I’m shocked on the rare occasion I come across a resume with an American-sounding name like Jim or Mary.
Female law school graduates have outnumbered male law school graduates for a couple of years now. The future is here already; they're just in junior positions.
Of course it's valid. Look for patterns. If 100 Apache Helicopters sign up for your app, congratulations, you just uncovered a new and very specific marketing segment to target.
My personal favorite anecdote: I was out driving in the car with my partner. I said "hey, that guy is riding an electric unicycle", and then, starting 4 hours later and lasting for 5 months, Facebook showed me ads for electric unicycles.
Sure, we all have funny little anecdotes about times we talked about something and then saw ads for it later that day.
But as a technologist and software developer myself, I know just how difficult it would be to make a device that would be able to constantly transcribe the audio input stream and surreptitiously upload that data all day - and do so without destroying the battery life or having a noticeable network bandwidth footprint.
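Just to put rough numbers on that, here's a back-of-envelope sketch (Python; the hours, bitrates, and words-per-minute below are all illustrative assumptions, not measurements of any real phone or app):

    # Back-of-envelope numbers for what "always listening" would cost.
    # Every figure here is an assumption for illustration, not a measurement.

    HOURS_LISTENING = 16                 # assume the phone is near speech ~16 h/day
    SECONDS = HOURS_LISTENING * 3600

    # Option A: ship compressed audio off the device.
    # Assumption: ~24 kbit/s, roughly speech-quality codec territory.
    AUDIO_KBPS = 24
    audio_mb_per_day = AUDIO_KBPS * 1000 / 8 * SECONDS / 1e6

    # Option B: transcribe on-device and ship only text.
    # Assumption: ~150 spoken words/minute, ~6 bytes per word of text.
    WORDS_PER_MIN = 150
    BYTES_PER_WORD = 6
    text_mb_per_day = WORDS_PER_MIN * BYTES_PER_WORD * 60 * HOURS_LISTENING / 1e6

    print(f"compressed audio upload: ~{audio_mb_per_day:.0f} MB/day")   # ~173 MB/day
    print(f"text transcripts only:   ~{text_mb_per_day:.2f} MB/day")    # ~0.86 MB/day

Under those assumptions, shipping raw compressed audio is on the order of 170 MB of upload per day, which would stick out in anyone's data-usage screen, while the sneakier transcript-only route shifts the cost to running speech recognition continuously on-device, which shows up in battery and CPU stats instead. Either way there's a footprint that's hard to hide.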
Not to mention the fact that if any memos/emails documenting any part of these "black-ops" advertising programs were leaked, it would be the tech story of the decade and probably result in billions of dollars of fines and legal fees for Facebook/Google/Apple/Amazon.
No, in the end, it seems much more likely to me that these occurrences are just the Baader-Meinhof phenomenon in effect.
After all, how many things have you talked about that DIDN'T end up in your advertising stream? I mean, this is pretty easy to test. Why don't you just start talking about adult diapers in front of your phone right now and see what happens?
And do you recall all the times when you said something and didn't see any ads for it? It sounds like you are letting confirmation bias rule your thoughts.