
A complete understanding, if you ask me, is one which requires no impenetrable "black-box abstractions" to be substituted anywhere in the abstraction hierarchy. In the case of a CPU, you would have to have a full gate-level understanding of it, and an understanding of how the gates themselves operate.

To be honest, the "complete understanding" died way before the original post. It died the moment they started building computers with integrated circuits inside them (before 1970) and complexity rocketed exponentially. It died doubly so the moment electrical engineering and the practical side of computer science diverged into "them" and "us".

If you want to use something which you understand, you will probably need to buy a crate of transistors, resistors and diodes and wire-wrap yourself an entire computer from scratch, PDP-7 style.

This fact is a warning: We really are building things with abstraction hierarchies so deep that knowledge gets divided. One day we will have no hope of comprehending any of it within a lifetime.



Then blacksmiths in the Middle Ages did not 'understand' the forging of swords. Only modern materials science allowed us to 'understand' why forging creates harder metal.

This is a semantic discussion about the meaning of 'understanding': does it mean you can globally explain how the system works and could come to understand the smallest detail of every part? Or does it mean you understand the smallest detail of every part?

The latter is a nonsensical definition: if that is the case, then nobody understands processors, because nobody understands transistors, because nobody understands quantum mechanics, because nobody understands why the fundamental forces act in certain ways. Nobody understands Newton's laws, nobody understands where babies come from and nobody understands what it means to perform a 'computation'[1].

Of course, that means the former was also a nonsensical definition.

[1] http://plato.stanford.edu/entries/church-turing/


> Then blacksmiths in the Middle Ages did not 'understand' the forging of swords. Only modern materials science allowed us to 'understand' why forging creates harder metal.

The interesting way to construe the article's claim is not that it's impossible to know everything, but that it's impossible to know everything that people already know about the field you work in.

Were there blacksmiths who knew everything anyone knew about forging swords? Did Newton or Da Vinci know everything anyone knew about the various fields they were expert in? Are there farmers now who know everything anyone knows about how farming works? The article claims that at some point it became a certainty that programmers cannot know everything that anyone knows about how to use the tools they use and what those tools do. The stack is too complex. That's at least a sensible and interesting claim.


No blacksmith, farmer or famous scientist/artist/Renaissance man in history has ever known everything there was to know (about the field they worked in). Even back then, there was already more knowledge being produced than they could possibly ever obtain. As with us, the ultimate problem is a lack of time. Whether the time required to travel to a neighbouring city to learn from their guild or the time required to read a paper on the internet, the problem is time.

All stacks have always been too complex. To produce optimally, a farmer would have to understand everything there is to know about meteorology, biology, sociology, economics and earth sciences concerning his specific area. No farmer ever has.

In that sense of 'understand', nobody has ever completely understood any system they worked with/in. Given a system and a person, you can come up with a legitimate detail question about the system that you also believe the person couldn't answer.


Semantically speaking, understanding is simply knowing enough to recreate something, preferably with your own acquired skills and knowledge. We're all just fancy parrots wrapped up in monkey bodies.

To be honest, the bedrock abstraction should stop at "what humans can realistically create with their own hands from nothing". You can make your own transistor quite easily, and the Ebers-Moll model provides a nice set of rules to work with.
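For the curious, here's a minimal sketch (in Python, not anything from the thread) of the kind of rules Ebers-Moll gives you for an NPN transistor; the saturation current and the two gains below are assumed textbook-ish values, purely illustrative:

    import math

    # Ebers-Moll model: the two junctions behave as diodes, coupled by the
    # forward/reverse common-base current gains.  Values are assumed.
    V_T = 0.02585                    # thermal voltage at ~300 K, in volts
    I_ES = 1e-14                     # emitter-junction saturation current, A
    ALPHA_F = 0.99                   # forward common-base current gain
    ALPHA_R = 0.5                    # reverse common-base current gain
    I_CS = ALPHA_F * I_ES / ALPHA_R  # reciprocity: alpha_F*I_ES == alpha_R*I_CS

    def ebers_moll(v_be, v_bc):
        """Return (i_c, i_e, i_b) for the given junction voltages."""
        i_f = I_ES * (math.exp(v_be / V_T) - 1.0)  # forward diode current
        i_r = I_CS * (math.exp(v_bc / V_T) - 1.0)  # reverse diode current
        i_c = ALPHA_F * i_f - i_r                  # collector current
        i_e = i_f - ALPHA_R * i_r                  # emitter current
        return i_c, i_e, i_e - i_c                 # base current = i_e - i_c

    # Forward-active example: V_BE ~ 0.65 V, base-collector reverse biased.
    i_c, i_e, i_b = ebers_moll(0.65, -5.0)
    print("I_C = %.3e A, beta ~ %.0f" % (i_c, i_c / i_b))

Two diode equations and two gains get you all three terminal currents, which is the appeal of the model for hand calculation.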

The quantum physicists and philosophers can keep arguing about the technicalities, then, and let the rest of the world observe, understand and create.


> what humans can realistically create with their own hands from nothing

could medieval blacksmiths really create a sword from nothing but their own hands? how would they get the iron? would they have the knowledge to locate veins of iron and then mine that iron? could they build the kiln and forge, the equipment necessary for forging?

arguably, by this standard, even farmers could not farm. farmers may know how to plant their crops, but without current crops to gather seeds from, would they know how to find the strains of plants that suit farming and then gather the seeds from those plants?


Well actually, I reckon yes. Much to my parents' latent terror (I didn't tell them until they came back from holiday), at the age of 15 a friend and I built a small blast furnace using some ceramic pots, bits of stone lying around and a hoover. I managed to get 50g of what looked like pig iron out of that bugger before it basically fell to bits and set fire to the lawn. That was from about 2kg of ore I found at the bottom of a cliff face next to a beach at Skegness. Was great fun! (The flux was limestone from the shed; the coke was 3 large bags of barbecue charcoal.)

I'm sure that and the rest of the process wasn't beyond people with a higher budget and requirements...

As for plants - it's all knowledge and experience. There is no abstraction. Eat this, don't eat that. I grow quite a few edible plants myself and there is little abstraction.


> You can make your own transistor quite easily

Not from scratch, no. Not even with all the intermediate knowledge available but none of the tooling and technology. Unless by "quite easily" you mean "in under a dozen generations".



You're joking, right? Taking apart an existing diode to build a transistor from it comes nowhere close to making the transistor from scratch. Not to mention the clamps and files, the microscope, the phosphor bronze for the contacts, ...

This video is great, but it's a century or two of progress away from "from scratch".


I do all my "fun" computing these days on a BBC Micro, in assembly language. I don't have total understanding of the ICs, it's true. But in its 64k of addressable memory (32k RAM), I've a pretty good idea of where everything is and what happens when. Very satisfying.


Very cool. I still have the old Osborne 1 on which I learned to program.

Interesting historical point - the BBC Micro was designed by Acorn Computers, the company from which the ARM processors that are so ubiquitous today descend.


Indeed, the original ARM was designed on a BBC Micro + 6502 Co-Pro. Amazing what "real work" you can get done on one :-)

http://en.wikipedia.org/wiki/ARM_architecture#History


I was the proud owner of an ARM copro [1] many years ago (I still have the Master it was plugged into) and the first Acorn RISC machine (an A310 with 512k RAM if I remember correctly).

They were and still are extremely powerful and productive machines.

[1] http://en.wikipedia.org/wiki/BBC_Micro_expansion_unit#ARM_Ev...


Returning to your original post,

> We really are building things with abstraction hierarchies so deep that knowledge gets divided

Abstraction is necessary, true, but it's not clear to me what the abstraction level we have now really gets us. In other words, say we had a BBC Micro with a 2GHz 6502 in it. What productive computing tasks that we do now could it not do? Or let's imagine an Atari ST with a 2GHz 68000, to get us a little more memory. What could it not do, that we need to do now? I'm struggling to think of anything.


It doesn't get us anything at all other than a fucking huge rabbit hole to stare down every time you do anything. Let's look at a pretty naff case: the .NET CLR on x86 running Windows Workflow:

application -> XAML -> framework -> server -> container -> C# -> CIL bytecode -> x86 instructions -> microcode.

Now Forth on a 68000:

Forth screen -> 68k instructions

To be honest, for what I consider to be life-and-death stuff, a 10MHz 68000 is good enough (I have one in my TI-92 calculator).


The main thing is that you'd need a new set of abstractions for security, and then you'd need to implement HTML5 on it anyways to do all the things we can do on a computer now.


> to do all the things we can do on a computer now

Such as what? 99% of websites are a) screens where you enter something into predefined fields, to be stored in a database, and/or b) screens that format, for nice display, things that you or other people have previously entered. They were doing that in the 1970s. Only the buzzwords change.


HTML5 is a piss-poor evolutionary abstraction of what was effectively SGML and scripting carnage.

If you could start again, would you really end up with HTML5?


Any idea where someone (such as me) born too late to have an original Micro might be able to get hold of one?


If you're in the UK, there's a liquid market in BBCs, C64s and similar on eBay. There are a few sellers who refurb them (e.g. new power supplies, cleaned up cases etc). You can easily adapt a BBC to use SCART too (the lead will cost about a fiver) and use it on a modern TV, if you don't fancy using a big old CUB monitor.


I hope that the Raspberry Pi will be a re-run of the BBC Micro; they could do a lot worse than bundling it with a modern version of BBC BASIC, which was very advanced for its time.


Brandy Basic is pretty much that: http://sourceforge.net/projects/brandy/


Get a Cub monitor though - it's not the same without one!


I have two :-)


Cool - I bow before your Eliteness! (pun intended ;-)


Good for you!

I (the parent of your post) actually have a BBC Master (and the advanced reference manuals) lying around still for precisely that reason. It's quite a handy and very powerful little machine to be honest.

It even runs LISP (AcornSoft LISP).


Will the game change once again when 3D printers are able to print circuits?


Not until the printers can make their own CPUs and complex parts such as threaded rods. That is a long time off.

3D printers are often promoted as being able to print themselves, i.e. as self-replicating. They are not: they can print only a small fraction of their own non-complex parts.




