Deep down the computer is still linear. Yes, we do all kinds of tricks with context switching to make it seem like it is doing a whole bunch of things at the same time. But if we visualized the activity in human terms, it would look like an assembly line that constantly switches between cars, trucks, scooters and whatnot at a simply astonishing rate.
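To picture that assembly line in code, here's a toy sketch of my own (not from the original comment; the job names and step counts are made up): a single worker round-robins between several jobs, so from the outside all of them appear to progress at once.

```cpp
// Toy "one core, many jobs" scheduler: a single loop of execution
// hands out one slice of work at a time, round-robin, so every job
// appears to make progress simultaneously.
#include <cstdio>
#include <string>
#include <vector>

struct Task {
    std::string name;
    int steps_left;
};

int main() {
    std::vector<Task> line = {{"car", 3}, {"truck", 2}, {"scooter", 4}};
    bool work_remains = true;
    while (work_remains) {            // one "core" switching contexts
        work_remains = false;
        for (Task& t : line) {
            if (t.steps_left == 0) continue;
            --t.steps_left;           // do one tiny slice of this job
            std::printf("worked on %s (%d steps left)\n",
                        t.name.c_str(), t.steps_left);
            work_remains = work_remains || t.steps_left > 0;
        }
    }
    return 0;
}
```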
Multicore processors literally run several things at the same time. Even a single core can literally run several instructions at the same time thanks to instruction-level parallelism, in addition to reordering instructions, predicting and speculatively executing branches, and so on. The processor also has a cache subsystem that interacts with the memory subsystem on behalf of the code -- but this all works in parallel with the code. Memory operations are executed as asynchronously as possible in order to maximize performance.
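As a concrete illustration of that asynchrony, here's a minimal sketch (my own, using C++ relaxed atomics; the count you see will vary by machine) of the classic "store buffer" litmus test: each thread writes one flag and then reads the other. Under any strictly linear interleaving at least one thread must read 1, but real hardware's asynchronous memory operations allow both to read 0.

```cpp
// Store-buffer litmus test: under a linear interleaving of the two
// threads, at least one of them must read 1. Real hardware can defer
// stores asynchronously, so both sometimes read 0.
#include <atomic>
#include <cstdio>
#include <thread>

std::atomic<int> x{0}, y{0};
int r1, r2;

int main() {
    int both_zero = 0;
    for (int i = 0; i < 100000; ++i) {
        x.store(0, std::memory_order_relaxed);
        y.store(0, std::memory_order_relaxed);
        std::thread t1([] {
            x.store(1, std::memory_order_relaxed);
            r1 = y.load(std::memory_order_relaxed);
        });
        std::thread t2([] {
            y.store(1, std::memory_order_relaxed);
            r2 = x.load(std::memory_order_relaxed);
        });
        t1.join();
        t2.join();
        if (r1 == 0 && r2 == 0) ++both_zero;  // impossible in a linear world
    }
    std::printf("both threads read 0 in %d of 100000 runs\n", both_zero);
    return 0;
}
```

Thread startup is slow enough that the racy window is small, so the count depends entirely on the machine -- but any nonzero result is proof that the memory operations did not happen in program order.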
What's more, outside the processor, what we call "a computer" is actually a collection of many interconnected systems all working in parallel. The northbridge and southbridge chips coordinate with the code running on the CPU, but they're not synchronously controlled by it, which means they are legitimately doing other things at the same time as the CPU.
When you read something off disk, your CPU sends a command to an IO controller, which sends a command to a controller in the disk, which sends a command to a motor or to some flash chips. Eventually the disk controller gets the data you requested and the process goes back the other way. Disks have contained independent coprocessors for ages; "IDE" stands for "Integrated Drive Electronics", and logical block addressing (which requires on-disk smarts) has been standard since about 1996.
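To see that round trip from software, here's a rough sketch using POSIX aio (assuming a Linux or other POSIX system; link with -lrt on glibc; the file path is just a placeholder): the CPU submits a read, the I/O controller and the drive's own processor do the work, and the CPU is free to do other things until the data comes back.

```cpp
// Submit an asynchronous read, then keep the CPU busy while the I/O
// controller and the drive's firmware handle the request.
#include <aio.h>
#include <fcntl.h>
#include <unistd.h>
#include <cerrno>
#include <cstdio>
#include <cstring>

int main() {
    // "/etc/hostname" is just a stand-in for any readable file.
    int fd = open("/etc/hostname", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    char buf[256] = {0};
    struct aiocb cb;
    std::memset(&cb, 0, sizeof cb);
    cb.aio_fildes = fd;
    cb.aio_buf    = buf;
    cb.aio_nbytes = sizeof buf - 1;
    cb.aio_offset = 0;

    if (aio_read(&cb) != 0) { perror("aio_read"); return 1; }

    long spins = 0;
    while (aio_error(&cb) == EINPROGRESS)
        ++spins;  // the CPU runs this loop while the disk does the work

    ssize_t n = aio_return(&cb);
    std::printf("read %zd bytes after %ld spins of other work: %s",
                n, spins, buf);
    close(fd);
    return 0;
}
```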
Some part of your graphics card is always busy outputting a video signal at (say) 60 FPS, even while some other part of it is working through a command queue to draw the next frame. Audio, Ethernet, wifi, and Bluetooth all likewise happen simultaneously, with their own specialized processors, configuration registers, and signaling mechanisms.
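Here's a toy model of that split, with no real graphics API involved (the 60 Hz tick and two-buffer layout are my invention): one thread "scans out" the front buffer on a fixed tick while another works through "draw commands" into the back buffer, and the two flip when a frame is ready.

```cpp
// One thread plays the display engine, showing the front buffer on a
// fixed tick; another plays the command processor, drawing into the
// back buffer and flipping when a frame is done.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>
#include <utility>

int front = 0, back = 1;          // indices of the two "framebuffers"
int frames[2] = {0, 0};           // contents: just a frame number
std::mutex flip_mutex;
std::atomic<bool> running{true};

void scanout() {                  // stands in for the display engine
    while (running) {
        {
            std::lock_guard<std::mutex> lock(flip_mutex);
            std::printf("scanout: showing frame %d\n", frames[front]);
        }
        std::this_thread::sleep_for(
            std::chrono::microseconds(16667));  // ~60 Hz
    }
}

void render() {                   // stands in for the command processor
    for (int f = 1; f <= 10; ++f) {
        frames[back] = f;         // "draw" the next frame (takes a while)
        std::this_thread::sleep_for(std::chrono::milliseconds(25));
        std::lock_guard<std::mutex> lock(flip_mutex);
        std::swap(front, back);   // flip buffers between scanouts
    }
    running = false;
}

int main() {
    std::thread display(scanout), gpu(render);
    gpu.join();
    display.join();
    return 0;
}
```

Note the display thread never waits for the renderer: when a frame takes too long, it simply shows the old front buffer again, which is exactly what real scanout hardware does.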
Computers do lots of things simultaneously. It's not an illusion caused by rapid context switching. Frankly, the illusion is that anything in the computer is linear :-)
Following up on that last thought, that linearity is the illusion: this talk explains in detail the Herculean effort hardware and language designers expend to create something comprehensible on top of the complexities of today's memory subsystems. It's three hours long and focused on how all this affects C++, but IMO it's well worth the time, and it's accessible to anyone who has some idea of what happens at the machine-code level.
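For a tiny taste of what that comprehensible layer buys you, here's a minimal sketch (mine, not from the talk) of C++'s release/acquire pairing: the release store publishes the preceding plain write, so a consumer that sees the flag is guaranteed to see the data too, even though nothing underneath executes in one linear order.

```cpp
// Message passing with release/acquire: the release store "publishes"
// the plain write to data, and an acquire load that sees the flag is
// guaranteed to see the data as well.
#include <atomic>
#include <cstdio>
#include <thread>

int data = 0;
std::atomic<bool> ready{false};

int main() {
    std::thread producer([] {
        data = 42;                                    // plain write
        ready.store(true, std::memory_order_release); // publish it
    });
    std::thread consumer([] {
        while (!ready.load(std::memory_order_acquire))
            ;                                         // wait for publication
        std::printf("data = %d\n", data);             // always prints 42
    });
    producer.join();
    consumer.join();
    return 0;
}
```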
Absolutely. You can get systems that don't perform the illusion for you, e.g. the Tilera manycore processors, and they've not taken off because they're a pain to program.