On the one hand, it's easy to dismiss any new proposal with the argument that "we've heard this 20 times before and it's never worked," and you'll usually be right.
But public clouds, for example, aren't really like historical timesharing: the underlying tech, capabilities, and demand are so different that things really are different this time.
One of the big differences between now and then is network capacity. Once we moved past basic text and into 3D, photos, and video, timesharing couldn't work effectively over the internet; the bandwidth simply wasn't there yet. The only way to keep going was to bring the hardware home. Now that we have the network capacity to handle almost anything, we're seeing things swing back.
I'm not sure why networking and cloud computing are in this thread. The cloud is about the last thing you want to try or use when learning to program (except for internet access and all its learning resources), and it won't make you as familiar with how a computer actually works as programming, say, a Tetris clone.
It depends on what your objective is and what you're trying to accomplish.
For a lot of people trying to accomplish some specific goal, learning to program in C (as per the article) is probably not the best approach unless they're into OS kernels or embedded programming. Instead, they might well be better off stitching together cloud services of various types. Not everyone's objective is passing a leetcode whiteboarding interview at some ad-tech company.
Yes, connectivity is a huge difference. On the other hand, as there's more and more data-intensive activity happening outside the data center, we're actually seeing a general trend away from everything happening "in the cloud" as was being promoted in the late 2000s. See e.g. Nick Carr's The Big Switch.