I am still in awe of NeXT's software technology, generally. It was just so carefully and intentionally designed as a coherent whole; one would hope this was where we were going as we got better at architecting software (as individuals, as a field), but disappointingly, in retrospect it appears as a kind of high point, after which we continued to descend into ball-and-twine mediocrity. For reasons economic and social that I think people could argue about a lot, but we don't, in part because as a field we don't seem to agree on what excellence in software architecture/design even means anymore.
But what I want to talk about instead is:
> Like EOF, our database layer that still puts Ruby-on-Rails to shame.
I spent a couple years programming with EOF (the "Enterprise Object Framework", an ORM), and many more recent years programming with ActiveRecord. EOF had a few features I miss that ActiveRecord still doesn't have (like properly functioning multi-table inheritance, and lazy "eager loading" triggered on first access for all associations; Rails 6.1 has a welcome feature to RAISE on n+1 behavior, but why not just lazily trigger the efficient load instead, which is probably no harder to implement? Maybe nobody thought of it, having not used EOF?).
But I wouldn't actually say it still puts ActiveRecord "to shame". ActiveRecord is very similar to EOF in design, by 2020 nearly as mature, with 80-90% of the features.
Yeah, it's striking that ~20 years later we can say AR is mostly as good as EOF haha (and doesn't have anything of note that EOF didn't already have; it hasn't superseded it in any way). Its internal architecture isn't quite as elegant. But it really is nearly as good as EOF; its deficiencies compared to EOF aren't large enough to be particularly shameful, in my experience/opinion. It's in the ballpark!
AR is so similar to EOF that I have always wondered if some of its designers had experience with EOF.
As a former NeRD (NeXT Registered Developer) who started a company that did custom NeXT development, I both strongly agree and strongly disagree.
The technology really was great. Their understanding of object orientation was superior. The developer tools were wonderful. The user experience was generally a delight. We could develop custom software in a fraction of the time of people using the tools of the day. NeXT had a true vision of the future.
However, what they didn't have was much understanding of economics. The only reason that NeXT wasn't a complete commercial failure was that Apple's board wanted Steve Jobs back. If not, Apple might instead have bought out Be. And if Apple had succeeded in developing their own next-gen OS, both NeXT and Be might be minor footnotes these days. Even prior to the deus ex machina buyout, NeXT was on a slow and steady path to failure. They'd gone from an integrated hardware vendor to an OS-on-other-hardware vendor to a dev-tools-on-other-OSes vendor, and it's not clear that would have worked either. Once the acquisition was announced, they promised to take care of the people who had stuck with them and then did jack.
I took a few lessons away from my time with NeXT. 1) Just because I thought something was technically superior didn't mean it was commercially viable. 2) Being too far ahead of the market is worse than being behind it. 3) Never trust a "visionary leader" to look out for you, no matter what he says. He's in it for himself and the vision; the little people are expendable.
But you're definitely right that it made using other stuff painful. I stopped doing GUI development altogether rather than shift to Windows, which was just awful by comparison.
I think the technical excellence was mostly a side issue there. What really mattered was that they made a better product. I'm sure there were companies that were just as technically excellent but not as focused on value delivery.
Technology always does matter for technology products. But my point is that pursuing technological excellence as a product goal often leads to unsuccessful products.
Google could have -- and probably would have -- failed if not for Google Ads. The great search results drove people to use their product, but it didn't actually earn them money directly.
You can have the best product on the market and fail, and you can have a terrible one, yet succeed. Google has as much business acumen as they do technical chops, and that's why they are such a success. Same with MS and Apple.
Back in the day AltaVista was superior for search, but it was a loss leader; it was designed to sell DEC hardware. I shared some space with Lycos, but they couldn't profit. I think Google executed really well, even to their own surprise in hindsight.
I'm lost, you're saying because their product has a monetization strategy that it doesn't count somehow? By that logic, we've reduced the statement I'm replying to into the tautology "cool products with no monetization strategy aren't monetized". Doesn't have quite the same ring to it.
Edit: I guess if we want to limit it to cool technologies that are directly monetizable, the Ford Motor Company of the early 20th century (or maybe Tesla) is a better example? It's just as silly a statement either way.
I'm just saying that technical excellence is orthogonal to how much money a product makes.
There is a ton of shit software out there that makes buckets of money, and lots of well designed, well executed products never make it off the ground. So you can't just focus on good engineering and expect clients to line up out the door.
NeXT was also mostly ahead of the market on the software side. Their machines were a very tough sell compared to the price and performance of other UNIX workstations of the time (which is why I know SunOS and not NeXTStep).
All the vision and all the software quality in the world won't make you competitive in the 90s UNIX workstation market if your machines are underpowered, and we were used to garbage software anyway. Chasing the "personal workstation"/PC market also would never work. DOS/Windows was far too strong and the Macintosh deep in a niche. It's very unfortunate.
NeXT failed on the hardware cost side because they wanted to be a personal computer and not a workstation. They were priced for neither market.
I looked very seriously at Unix machines around the time NeXT came out, having been converted to that religion in college. NeXT started at around US$6500, and that was with the optical disk only. The equivalent-ish Sun box (Sun 3/80) started at around US$15k with disk as I recall and went up in price really fast if you wanted more memory/disk/etc. About the cost of a new Honda Accord at the time. And the Sparcstations were out at much higher performance (and price...I seem to recall around US$22k for a usable config).
On the other hand, you could get a nicely decked out 386/33 for maybe half the cost of the NeXT, or a 486 for a grand or so more. And it ran tons of software, even if it was garbage. Even Unix.
The NeXT at launch was $6500 list for the base model. There were academic deals where you could get it for less, but that's not what we are talking about. And you could get a machine for $3k at least as fast in 1988; the 68030 was past its prime. If by 'NeXTstation', you mean the 'Slab' pizza-box NeXT, that was $5k list for the mono version, released in 1990, and had a 68040. You could get a much faster machine for $3k in 1990.
The trade name for the original $6500 cube was the NeXTcube (68030). The names for the pizza box workstations were the NeXTstation (68040/25 MHz) and NeXTstation Turbo (68040/33 MHz). The NeXTstation spec’d out at 15 MIPS.
You could buy a NeXTstation for $4995 on the open market, and considerably less with an educational discount.
You could buy a Sun Sparcstation 1, which was released a bit earlier and ran a 20 MHz RISC processor, but it was $9,000.
If I recall correctly, SGI workstations like the SGI Indigo at the time started at somewhere around $7500 and depending on configuration could be nearly $40k. The SGI Indy, which was the low end SGI machine, was faster and priced at an identical $4995 - but it wasn’t released until mid 1993.
The first Macs to use the 68040 weren’t released until mid 1992. And they also cost $7,000+.
I am unaware of a machine that was available in 1990 for under $5,000 that had more horsepower. If you can point me to one I’ll gladly concede. But at the time I was actually a NeXT campus consultant, which meant I was selling them and knew the specs of both the NeXT products and the major competitors, and I’m not aware of one. Certainly by 1992 the Mac and high end PCs were catching up, albeit both with vastly inferior operating systems.
That was a long time ago and I could well be wrong. But if so I want to see evidence.
The fact is NeXT machines were dogs. The OS was nice but the hardware was very underpowered. I have a slab in my retro collection. It looks pretty, but a Sun Sparc from the same era is much more powerful.
But they were sold to the same markets. NeXTs were being sold both to the workstation and to the personal computer market. They were cheap but underpowered for a workstation, making them not very good for workstation-ish things, because you couldn't scale up. For a personal computer, they were very expensive, so they didn't do well there either.
I’m not arguing they had a good business plan - clearly they did not. All I’m arguing is that they were very good computers for the price, both in terms of software and hardware.
Ah, a lovely machine. That was their second generation, when they were starting to get a sense of reality. Although Wikipedia has the introductory price at $4995, or nearly $10k in 2020 dollars.
Sure, but the Mac II fx was the high-end machine in a consumer line with plenty of low-end options. And the Sun boxes were workstations targeted at businesses and institutions, where high price is not a barrier if the business value is there.
The NeXT hardware was never really competitive in either market except certain niches. E.g., all our clients were in financial trading, because they were willing to pay a huge premium for rapid app development for financial traders.
Thanks for sharing. Can you elaborate a bit on why GUI development for NeXT was (and probably is) superior compared to Windows GUI development (even if we include Borland's efforts)?
At the time of NeXT’s heyday in the early ‘90s, most GUI programming was textual. You’d call add(button) and button.text = “Hello World” to build up your GUI, and have to wire up the events from your button to take specific actions. Quite a lot of GUI programming is still like this, even now.
What NeXT brought was a GUI editor that allowed you to drag a button from a palette and onto a window (or view). You could then change the text on the button by double clicking on it and renaming the default text. You also got to determine where and how large the button was in relation to the rest of the window.
Most GUI builders could do this, so what was special about Interface Builder?
Two things stood out. First, you could specify how the button reacted to window resizing. There was a “springs and struts” layout mechanism that allowed you to say which parts were fixed offsets and which were variable. You could also say if the button would resize, and if so, in the X or Y or both directions.
The second thing was the ability to connect the button to an action. By Ctrl clicking and dragging, you could wire up the default action to a “selector” — in effect, a virtual method call, on the owner of the button. This owner would be populated at startup, typically the application (controller). So you could have your code with the responder and another team build the UI, and they would join together at runtime.
You could also use properties generated by code - you could connect the button’s field to an object’s property (aka an outlet) so that changing the code changed the UI.
The fact that you could drag and drop connections from UI to code, and from code to UI, as well as building a responsive UI, was really what stood out.
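For anyone who never touched IB: the connections it made visually map pretty directly onto today's AppKit, the descendant of those NeXT frameworks. Here is a minimal, hypothetical Swift sketch of the same ideas in code form (the class and method names are just illustrative, not anything from NeXT; it assumes a plain AppKit window whose contentView you pass in):

```swift
import AppKit

final class GreetingController: NSObject {
    // An "outlet": a property in code that points at a UI object,
    // so the code can read or change the UI. IB would fill this in for you.
    var label: NSTextField?

    // An "action": the method the button's target receives when clicked.
    @objc func sayHello(_ sender: NSButton) {
        label?.stringValue = "Hello World"
    }
}

// Roughly the kind of object graph IB archived into a nib and rebuilt at launch.
func buildContent(in contentView: NSView, controller: GreetingController) {
    let label = NSTextField(labelWithString: "...")
    label.frame = NSRect(x: 20, y: 60, width: 200, height: 24)
    contentView.addSubview(label)
    controller.label = label  // connect the outlet

    let button = NSButton(title: "Say Hello",
                          target: controller,  // who receives the message
                          action: #selector(GreetingController.sayHello(_:)))
    button.frame = NSRect(x: 20, y: 20, width: 120, height: 32)
    // "Springs and struts": the right and top margins are flexible (springs),
    // so the button stays pinned to the bottom-left corner as the window resizes.
    button.autoresizingMask = [.maxXMargin, .maxYMargin]
    contentView.addSubview(button)
}
```

The point of IB was that you never wrote this by hand: someone dragged the button out, sized it, set the springs and struts, and Ctrl-dragged the connections, and the archived nib recreated the equivalent objects and wirings at runtime.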
This still lives on in Xcode today: Interface Builder and Project Builder begat Xcode plus IB, which in turn begat today's Xcode with IB built in. The “nib” format - NeXT Interface Builder - was a binary file containing the descriptive state of the UI and the wiring requirements; the “xib” it was renamed to when XML became all the rage is the same thing. The fact that IB has been subsumed into Xcode hides that this is still what's happening under the covers.
I think it’s important to realise that this was in an age when Windows 3.1 was all the rage, and we had only just got out of 256-colour VGA, while the NeXTstation had 16 million colours.
Nowadays, with everyone doing MVC programming on the web, it doesn’t seem so important. But then, there was also a time when no one wrote unit tests because it was seen as pointless; it is from these seeds that ideas become mainstream.
Great description. Two things I'd add to that: NeXT's Smalltalk heritage, with what I think of as real object orientation, was great for UI programming. Objects on screen were actually objects that you could message. And Display Postscript made it much easier to get visually solid results.
I can't remember where I read this, but the basic notion was that the Mac was easy to use but hard to program, limiting its adoption. Jobs learned a lesson in that he wanted his NeXT machine to be easy to use and easy to program.
As you say, it might all seem a bit tatty now. The web has raised the game of interface creation quite a bit. But this was in the late 1980s, where a lot of what they did was revolutionary.
And the dithering was wicked fast. I remember playing back multiple videos on NeXTSTEP, fully dithered and quite good looking, where the same machine (a dual-boot Intel P90 setup) had huge issues even playing back one video.
I’m sure that you’re aware that MS Access, Delphi, Visual Basic, and Progress, as well as a host of other tools, existed at the time, and you’re fine to point out that NeXT was superior, but given that really none of these systems survived, something else must be going on.
Sure, but Delphi's first version was released in 1995, whereas this was something I was programming in 1992 (and I came late to the party with NeXTSTEP 3).
Try using plain ES7+ (with async/await) JavaScript with Mithril (for defining components and their behaviors) and Tachyons (for Atomic CSS for styling). I like that combination best after having used Smalltalk and a variety of GUI builders (including Delphi and ones for Smalltalk and NewtonScript) and Angular and React. (TypeScript is OK too for bigger projects where documenting interfaces wins out over speed of development in plain JavaScript...)
And having dealt with GUI builders with special formats and coding implication related to objects sending special events, I'd much rather just write plain code in one language in a text editor than wrestle with a limited WYSIWYG tool.
Mithril's brilliance is assuming the UI is dirty if you have touched it in some way (mouse click, keystroke, etc.) and always rerendering after the event is handled (unless you want to optimize that away). That leads to UI code which is much easier to reason about than the arbitrary networks of dependencies that older UI toolkits emphasized. That style of UI development feels a lot more like, say, programming a continually-rerendering video game in OpenGL than programming a dependency-based UI for VisualWorks/NeXTSTEP/Delphi/VB/etc.
More on all that by me: https://github.com/pdfernhout/choose-mithril
"tl;dr: Choose Mithril whenever you can for JavaScript UI development because Mithril is overall easier to use, understand, debug, refactor, and maintain than most other JavaScript-based UI systems. That ease of use is due to Mithril's design emphasis on appropriate simplicity – including by leveraging the power of JavaScript to define UIs instead of using an adhoc templating system. Mithril helps you focus on the essential complexity of UI development instead of making you struggle with the accidental complexity introduced by problematically-designed tools. Many popular tools emphasize ease-of-use through looking familiar in a few narrow situations instead of emphasizing overall end-to-end simplicity which -- after a short learning curve for Mithril -- leads to greater overall ease-of-use in most situations."
And I say that even having been an official NeXTSTEP developer once upon a time -- after I gave Steve Jobs my business card when I met him following a talk he gave at Princeton, he got me into the developer program (my paperwork to join that program, with its aspiration to build a system where any piece of data could be linked to any other piece of data, had previously apparently been ignored). Even after reading through all the glorious NeXT developer info, I never felt I could afford the NeXT hardware, as much as I wanted it (the short warranty gave me pause too) -- so my career as an independent software developer went in different directions. After reading the article and comments here, I can wish I had just thought to go work for NeXT instead of wanting to be a customer...
Personal experience: Around 2005 I was looking for a platform for a new web app, after some years out of development but having worked extensively with NeXTstep and EOF in the 90s.
After watching DHH's video and reading the Rails book, it reminded me so much of my previous experience with NeXT technology that I had no other choice but to go with Rails.
The dynamism of Ruby had a lot in common with ObjC's runtime. And reading about ActiveRecord at that time I also had the feeling that its authors had worked with EOF before.
All in all, NeXT built great stuff. I still own a NeXTstation Color that I got in 1992 (one of these days I should try to turn it on again). And it's a testament to the quality of that software that some pieces that I'm still running today, like Apple Mail, trace back almost directly to tools I started using back then (NeXTMail).
Yep, people don't often comment on how similar Ruby and ObjC are, in fundamentals.
I think it's because both of them were so influenced by Smalltalk, more than ObjC influencing Ruby necessarily. But not sure.
But I'm still very curious whether AR's creators knew EOF, yeah. I haven't found DHH mentioning it; not sure if there might be a forgotten other person or people central to the original AR architecture.
WebObjects itself was nice in many, many ways (I think its encapsulation of form handling is far better than anything anyone's managed in Rails)... but it made a fundamental mistake in trying to keep a fundamentally stateful architecture and apply it to the web, by putting what was effectively an opaque state ID in every single URL. This was a basically bad design for the web (although it also provided for the aforementioned good encapsulation of form handling. :) ).
But yeah, the sense I get in my career is that we spend a lot of time trying to reinvent something that already existed, getting close to being as good as it... then collectively moving on to the next language/platform and doing it again. With not a lot of progress. Up to and through the 90s, it seemed like there was actual progress in software design and architecture at the high level -- the level of affordances for developers to efficiently create reliable, maintainable software -- but it seems to me to have stalled, perhaps in favor of huge advances in more low-level stuff: better/different languages/language paradigms, etc.
> I am still in awe of NeXT's software technology, generally. It was just so carefully and intentionally designed as a coherent whole [...]
The closest I got to experiencing the inner workings of NeXT software is observing the boot log of Mac OS (which you can see if you boot it with Qemu/Clover). I haven't seen so many triple exclamation marks in a while. That somehow didn't leave the impression of carefully and intentionally designed software.
I couldn't say how similar a 2020 MacOS bootlog is at this point to anything that was in NeXT, and wouldn't assume that whatever you're seeing now that you find inelegant was there in NeXTStep 20 years ago or longer. I mean, maybe, but I wouldn't just assume it and judge NeXT for it. ¯\_(ツ)_/¯
In any event, the boot log is not something I had occasion to pay attention to in NeXTStep, I couldn't speak to it.
NeXTStep/OpenStep had a great development environment and was full of innovation but even in the '90s it had old BSD components that were rarely updated and it really wasn't a great unix. Mac OS X has followed that pattern. Also Mach was inherently slow so running OpenStep on x86 hardware was slower than Linux or Windows - in Mac OS X they finally gave up on a pure microkernel and flattened the kernel to reduce the overhead of message passing through the BSD personality layer to Mach. But folks running OpenStep were running it for the RAD development tools and EOF that let you quickly design a UI with a very usable ORM that allowed you to take a desktop app and turn it into a webapp via WebObjects seamlessly. They complained about the *nix layer even then, but the unix layer was adequate and you could compile newer versions of tools you needed then as now.
After Apple bought NeXT, they upgraded the Mach component from 2.5 to 3.0 (from Apple’s MkLinux project). But it was always a hybrid kernel in both NEXTSTEP and macOS.
And I'm upset I didn't get an opportunity to properly work with WebObjects. WebObjects with Swift would revolutionize the web - IMHO - it was gone too soon.
I coded full-time in WebObjects from 2006-2008, making web apps in the health care industry.
During my Software Engineering degree I learned the difference between a Library and a Framework, but it wasn't until actually using the WebObjects Framework that the light bulb went off in my head. It was a pleasure to work with, and clearly very, VERY well thought out.
EOF was great, and every time I made a new NSArray() it brought a smile to my face.
It would hardly do that; in case you aren't aware, they were at the genesis of J2EE.
> Since the transition of WebObjects to Java in 2000, the functionality of many of Apple's Java Foundation classes is replicated in Sun's own JDK. However, they persist largely for reasons of backwards-compatibility and developers are free to use whichever frameworks they prefer.
When I was an intern at Apple, I somehow finagled my way into some long-time manager’s backyard cookout. A lot of Apple old-timers were there, including Blaine. Really neat guy and a great raconteur. That’s when I realized engineers of that era were cut from a different cloth.
Yep, you really needed to know the intimate inner workings of the CPU and each digital chip from the keyboard input to the display output to produce code that would make a quality product.
It took years of experience that you could only gain through hands-on work to know how to debug hardware. That's why gray-beards are so valued in hardware, versus software, where people talk of ageism.
When your hardware/low level software doesn't perform as expected you can't google/stack overflow yourself out of the problem, you need to grab the datasheets, the schematics, an oscilloscope, a soldering iron, hunch over patiently and devise a way to debug the issue out as no one else can help you.
Hardware engineering is now just as challenging as it was back then, but due to the commoditization of hardware, along with the rise of China and the downfall of high-tech giants like IBM, Philips, Siemens, Motorola, Nokia, Blackberry, Nortel, Ericsson, etc., most hardware jobs disappeared or moved overseas, and pay went significantly downhill compared to software engineering (at least in Europe).
I took a Microprocessors course in college (University of Florida) where in the lab you built up an entire board from scratch... including soldering each chip and component. You had to essentially program the drivers from scratch, using assembly... at the end your board had inputs and outputs like a speaker, switches, keypad, infrared remote control... everything was in assembly and you had to RTFM to know how each chip (pinout, clocks, etc.) worked.
It gives you a great knowledge and appreciation of what goes on beneath the covers. It was a grueling but satisfying class.
Trade school? What kind of college did you attend? Most CS curriculums I've seen are full of math (science), computer science (compilers, algorithms, etc) and engineering (computer architecture, etc).
It's a mixture of science and engineering, leaning towards the science. Far far away from a trade school which would teach you practical skills (java, git, web development) and try to get you into the labor force ASAP
You'll find Software Engineering degree programs are often connected to schools of EE, CS, or more generic Engineering if you dig into this more. And of course things sometimes work in a different way - I think maybe MIT folds EE, CS and SE all together into the same department, possibly even the same degree.
I believe ABET recently deprecated their accreditation of undergraduate programs, and I don't have any idea what the implications of that are for SE licensure, but there's a possibly out-of-date list of ABET-accredited SE undergrad programs here:
> You'll find google a useful tool for locating software engineering programs, but okay
Not the question asked, but the quality of the answer is about what I expected (based on what I suspected). Glad not to have made an upfront overinvestment taking the discussion too seriously.
What question did you ask that you feel went unanswered? You spent more words crafting a wordy disparagement of the reply than you would have spent asking for a clarification or making an actual contribution.
And that would be the difference between a CS curriculum and a Software Engineering course that someone might attend anywhere other than a Top 100 university. They are discrete curricula and the criteria to get on to SE tend to set a lower bar.
That's pretty interesting. I wonder if sitting for the PE exam were a legal requirement for calling oneself a "software engineer" in the United States, they'd raise the bar for admittance to those programs. (and honestly, improve the SE curriculum somewhat) As it stands, the trend towards licensing software engineers seems to have reversed itself recently. [1]
I was discussing this with my dad earlier today. Neither he nor I have a degree. He started a course in the 70s that he never finished, and I didn’t apply to university, for numerous reasons, but I was application age when fees were tripled to £9k, and the loan repayment structuring made it apparent that 80% of these degrees weren’t worth the money.
My current role title is Software Engineer, but I don’t go by it outside work very often because Engineer means something outside the technobubble. I have a friend with two degrees in aeronautical engineering, and he clenches every time he hears someone refer to me as an engineer. I agree with him too, but earning that title should be about more than a degree: engineers in industry almost always have to work under a supervising engineer for a significant period of time before assuming the title themselves.
Could you elaborate on what it means to you to be "educated as engineers"? What qualities or learnings are stressed in that model that you think current CS education misses?
Not the OP, but Computer Science is a scientific discipline not an engineering discipline and is generally taught as such. Unless you take specific Software Engineering topics you're unlikely to learn things like unit testing, build and integration tools, how to use a debugger effectively beyond maybe a brief introduction at best, etc.
If you were being taught software engineering as an engineering discipline, these should be absolute bread and butter core components of the course.
Not at all; in Portugal we don't have Computer Science as it is often discussed around here, but rather Informatics Engineering, with certified professional Engineering titles.
If you want computing theory without programming, you do a math degree with specialization in computing theory.
Plenty of other countries follow similar practices.
I don't see how you are disagreeing with me. It sounds from your description that your Informatics Engineering courses really are more engineering oriented than our Computer Science courses, which is what I would expect.
I guess that depends on the school offering the curriculum. I went to school as a Computer Science Engineering major. It wasn't EE, but it was way more than how not to write an infinite loop. It was billed as "you could design/build your own computer, and then write the software for it" type of path.
This got me to wondering what an interview about "the actual practice of writing software" would look like, and having a look at Glassdoor's mechanical engineering interview questions [1] for comparison, it doesn't seem these kinds of questions would elicit much better quality candidates.
I'm increasingly convinced that the apprenticeship program approach would yield far better, deeper results than how we're going about recruiting these days, but most business leadership is fiscally addicted to short-term hire-fire cycles instead of looking for ways to exert more control over their destinies. I suspect that recruiting model is an ingredient to systematized innovation (the "deeper" part I mentioned, which I use to denote internalized concepts, procedures, mental models, etc. necessary to fluent application and craft that I believe are absolute table stakes in innovation).
I think I can answer this, because I studied "Software Engineering" which was a 4 year degree (with honours), accredited by the Australian Institute of Engineers. A bunch of friends at my University studied "Computer Science", a 3 year degree (no honours option).
Early on our subjects had a lot of overlap - Java Programming 1, Database 1, Networking basics (TCP), etc. etc. I also had a healthy dose of Eng. Maths, and electrical eng. theory.
As the years rolled on I had subjects like Eng. Ethics, real-time programming, a software maintenance project, OS design, and large-scale system design, and I took a digital electronics minor (could have chosen game programming or OS design). Towards the end I did a final-year engineering project, which was a full-year project with 16 team members, working for a real client, building a massive piece of software from start to finish.
So my degree was a whole lot of Eng. stuff wrapped around CS. When we finished, my friends could sling code and make software. I could do that, but I could also design a bunch of stuff, and had a lot more theory to back everything up.
At a very high level, I'd say Computer Science is more about "just do the work" like a welder on a bridge or a mechanic on an F1 race team. Where-as Software Engineering is people who designed/tested/validated the bridge, or the engineers who actually designed the F1 engine/chassis/aerodynamics.
I see the difference play out in the workplace all the time when CS grads (or people with no tertiary education in Software) want to get right into slinging code, and they get totally lost building so many trees until the forest is really thick. Software Engineering folks step back and actually design the forest before they build trees. This also plays out when trying to fix stuff, or even root cause analysis. Doers want to get on with doing (welding, changing spark plugs) while Engineers want to get on with understanding what needs to be done, then figuring out how best to do it.
I think it means thinking a problem through and actually "proving" or at least "estimating" that your solution works before you set off to design and build.
a) I worked in the industry for 20+ years before I took a single algos course, and so have many other people
b) Algos is only a small part of it. It is not just individual bits of code, but the whole system you're designing, all the moving pieces, how they connect together, network effects, etc.
The University of Victoria (UVic) in Canada has a CS[1] program and an accredited BSEng[2] program, which might provide a useful comparison on an engineer’s vs a scientist’s education.
Disclaimer: I am not a graduate of either of those UVic programs or a CS program, and I am not an engineer (I graduated from a mechanical engineering technology program), so my comments should definitely be taken with a bunch of salt.
I would say there are three main differences:
Engineering specific:
1. All engineering programs share a common engineering core. Physics, chemistry, material science, math (usually these cover the material from an engineering perspective and are different than courses in the faculty of science), drafting/engineering graphics, engineering economics, and possibly some other courses.
2. An emphasis on professional responsibility and ethics. For example, what happens if a civil engineer certifies a bridge and it later falls over (there is a course where you learn this!)? What happens if a developer builds a website that leaks customers’ credit card information (I genuinely don’t know)?
Software specific:
3. A greater emphasis on systems and systems design, testing, projects, and few electives leading to a standardized curriculum providing a more consistent knowledge base across graduates.
Comments on engineering in Canada:
Graduates of the BSEng program can become registered as a Professional Engineer (PENG) in Canada. To call yourself an engineer in Canada requires that you are licensed (PENG) or in the process of gaining your license (engineer in training / EIT). In Canada, to be “educated as an engineer” would, to a first approximation, require someone to have a BEng, BSEng, or BASc degree from an accredited engineering program[3]. I am sure there are other accredited engineering degrees that I am not familiar with. Only having a BSEng means you are a graduate of an accredited software engineering program in Canada; you aren’t an engineer yet.
Professional Engineering registration requires a 4-year accredited engineering degree and 4 or 5 years of work experience. Some of the work experience must be supervised by PENGs. You must pass professional exams and keep up with professional development. Additional details can be found at [4].
Maybe we met there too? Honestly, one of the top 5 parties I've ever been to in my entire life in terms of intellectual firepower. I hope I get to go again someday.
One of my first experiences with Unix was getting an angry email from Steve Jobs.
He had a default message in the NeXT mail client back in the early '90s. I for some reason felt it was a good idea to send him an email and enable 'return receipt'. He replied, fuming at the violation of his privacy, and never answered my question.
That is perfectly, perfectly believable. He was never interested in being accountable to others. As Steve Wozniak said, "He had very, very, very negative sides and he didn't seem to care what other people felt."
Actually, a lot of successful people are like that. For a successful career you have to be comfortable stepping on others' toes and persuading others to do things your way. Human nature. Most of us are just the herd, secretly wanting to be led by a strong, charismatic leader.
I strongly disagree that this is some essential property of humankind. I think what you're describing is the learned helplessness [1] associated with trauma and abuse created by people like this. Is it common? Yes. Is it something we should shrug at? Fuck no.
I think far too few of us are given avenues to see our own competence & capabilities. Consumerism & the media & school each dilutes the genuine locus of control that lies within. Fear of losing shelter & health care & food drives our ability to exercise our "man, the tool maker" spirit, our willingness to venture forward.
Praising the wolves who don't seem to care, as somehow the rightful tenders of the herd, is not how I see things.
Top-level executives lean toward psychopathic tendencies far more often than your average worker bees at corporations do. https://www.telegraph.co.uk/news/2016/09/13/1-in-5-ceos-are-... . You have to be a cold person to do a lot of what they do, or at the very least have a very strong conviction that what you're doing is for some greater good (good chance it's yourself). I also suspect that ratio increases the larger the corporation.
I don't think there are more psychopathic tendencies in those positions because they're a necessity for being a successful leader.
I think a lot of people crave power, and with a disregard for morals and other people in general, it's easier to attain. Good people also make good leaders; they just need more luck and hard work to get there without kicking everyone in their path.
I think another way to put it is that organizations created by sociopaths tend to be set up for sociopaths. There is also the possibility we could create organizations set up for other kinds of people to succeed.
> There is also the possibility we could create organizations set up for other kinds of people to succeed.
“My exciting insight came in 2001 when I read Bernard Lietaer’s book, The Future of Money. It helped me see the ways our monetary system has shaped our culture and even our sense of humanity. That’s when I realized the things we had invented (in my company) were in fact new kinds of currencies, and that currencies are the main tools we use to shape patterns in community and culture. They are the DNA of our social organisms.”
Another way to put it is that once you have one sociopath/whatever setting up an organization/country/community and winning a few battles, the infection quickly spreads, and eventually people secretly accept that only a sociopath/whatever looks like a good leader (so that their interests can be protected and their conscience untouched).
What you miss here is that "great" leaders are often great at creating harm for those who don't follow them. The risk/reward ratio is artificially skewed.
I love reading stories about Steve Jobs at NeXT. He had been fired from Apple and wasn't on the winning team but he was still fighting to build great products. I know he's a controversial figure but he did great things.
Personally I find that Steve at NeXT is far more relatable than post iPhone Steve.
Heard a lot of stories, and this one about him talking to the story guy about his wife passing away. Can't imagine that happening with Steve I. It is cruel to say the best thing that happened to Beethoven was his deafness, and for Steve, his being fired. But life serves you lemons, and sometimes it is a good thing. Less arrogance, as he was quoted to say.
Probably just not English-as-a-first-language. GPT3 will typically have great grammar/spelling but nonsense meaning, but if you allow for slightly odd or near-correct word choice, the gp comment is pretty clear.
Wow, 81 comments and not one mention yet of the NeXTcube. That perfect 305mm x 305mm x 305mm (1 foot x 1 foot x 1 foot) magnesium cube was hell to produce, but gorgeous to look at. And, it was the machine that the original web server ran on at CERN. [0]
I've kind of always wanted one of them, but I've also kind of wondered if I'd be disappointed by it if I got one. After all, it only ran at 25 MHz.
What I'd really like to do is get an empty case and put a modern PC inside it. That would be awesome. You'd probably have to gut the case and put in new mounting hardware, but a mini-ATX or micro-ATX board would definitely fit in there. There should be room for drive rails and a PSU, but I wonder if ventilation would be a problem.
Perhaps the most fitting thing to stuff in there would be a Mac, or, maybe, a Hackintosh.
I’m not a fan of taking old computers and “harvesting“ their cases. People used to do this a lot with the original Macintosh models. Often they would turn them into aquariums. Now all those computers are collectors’ items, and many of them have been butchered. The original NeXT computers are much more rare than those Macintoshes. In my opinion they deserve to be preserved for history.
That's a good thought. I wonder if such a case mod as I propose could be done non-destructively. That is, if you were done with the modern PC inside, could you easily re-convert it back to a NeXTcube? I have no real idea, since I've never seen inside of one. :/
I worked in academia, and there were a few old NeXT machines around. I said, I'll collect these! None of them worked, but I figured my little office had enough space to give them a new home. A few other people had them, saw I was keen, and gave me theirs. I ended up with a cube and two slabs. Never got them to work.
Found out that another colleague had a few of them in a storage closet, and had aspirations of cobbling them together to make a working one. I gave him the machines.
He went on vacation, someone cleared out the closet, and fucking recycled them all. When he came back he was livid. I still feel the loss.
Software guy going hardware here, throwing down on machining hardware here in China next week. Funnily enough, I actually saw magnesium being machined for the first time on Monday: a part for a medical device. The machinist said it requires different coolant (white, in his case), but sources online say you can go dry as well. You also need different fire suppression systems in order to safely machine it, as the chips may burn from friction on blunt tools or excessive feed rates.

I wondered why bother, so I just looked it up, assuming in the NeXTcube case it was aesthetics. It does not match aluminium in terms of thermal conductivity, but allegedly may be one-third lighter, more dent-resistant, more easily machinable, and better able to shield electromagnetic radiation and dampen vibrations. Seems a lot to pay for some nominal benefits unless specialist applications demand it.

PS. Never saw a NeXTcube except in a museum maybe, but I earned some of my first Unix software money programming embedded cryptographic applications for the abortive Cobalt Qube and RaQ ecosystem, incidentally the only MIPS target I've ever written for, but no doubt NeXTcube-inspired.
I dunno, it just seems fetishistic & insular. Physical product design is not why I personally got into computers. Quite the opposite: the liberation of feeling like you had joined a plane of thoughts & ideas, decoupled yourself from the material. Apple still kicks out ultrapowered trash cans & cheese graters, & while sometimes the density is impressive, the showmanship of it has always been off-putting & bad-think-encouraging to me; it takes away from & distracts from far more important realities.
This seems kind of pedantic, since the "NeXT Computer" and the "Cube" were both cubes. They needed to change the name after the slab (NeXTstation) was released.
Yes, I understand they are different models, but informally both are referred to as Cubes (obviously due to the shape of the case.) In fact, if you look on ebay, the first "NeXT Cube" I found was actually a "NeXT Computer." I believe they may both actually have the same N1000 model number, despite CPU differences.
Would Steve Jobs have been tolerated these days, in a post-#MeToo era?
Now, I am definitely not accusing him of sexual harassment. But hand-in-hand with that, the culture seems to have shifted towards pressuring bosses of public companies and organizations to be less abusive in a range of domains. Would his behavior have been as tolerated or celebrated if he were still around today?
These people still exist. They run some of the major tech companies that produce products tech people love.
The difference is that top engineers have more options these days. They can choose to move into a high paying job at Google or Facebook where they don't have to deal with abusive relationships with the CEO.
Instead, companies with abusive CEOs attract people with high ambitions who don't yet have the skills and resume to walk into an easier, high-paying job. The CEO (ab)uses the ambitious, early-career people to extract as much work as possible before they burn out. The employees use the grind to level up their skills and resume to pivot into a better job later.
I worked for one such company early in my career. Turnover was high. It was basically a pipeline that either led to burnout or a cushy, high-paying job elsewhere if you could survive the abuse long enough to get an impressive resume out of it.
The catch is that none of us wanted to talk about how terrible the working environment was, because it would only devalue those lines on our resume. So instead we kept quiet and let everyone assume the famous tech company and CEO we worked for were actually amazing places to work. Anything else would be self-sabotage. It's a strange cycle.
I have a good friend who worked there during the 90s and wrote a shitload of the backend ordering system. She went on to work at Google as Director of Site Reliability Engineering.
Abusive/abrasive bosses are still celebrated today. Jeff Bezos runs a company where employees urinate in bottles because they aren't given time for a bathroom break and asks employees in meetings "why are you wasting my life". Tim Bray has some stories to tell about AWS too. Elon Musk abuses his employees, his shareholders, his companies, and everyone else on Twitter nearly every time he opens his mouth. It's almost cheating to mention Elizabeth Holmes. Same with Travis Kalanick.
I think Jobs would be thought of exactly the same if he were around and in his prime today: a very controversial figure who produces amazing work but has his fair share of detractors for a number of reasons. Remember, Steve's behavior was barely tolerated by a large number of people. He was hated by many, loved by many, merely tolerated by most.
It is interesting to note, though, that despite all his faults Jobs had very long, sometimes decades-long, extremely fruitful work relationships: Woz, Andy Hertzfeld, Joanna Hoffman, Avie Tevanian, Bertrand Serlet, Phil Schiller, Jony Ive... etc. And at Pixar too. Such high-caliber people wouldn't stay around if it was so terrible or there was no redeeming quality.
The linked post by Blaine Garst _glows_ with pride about having worked with Steve Jobs and the all-star team he assembled. Quote: "great minds collaborating and challenging each other to succeed. With the best CEO on the planet."
The "challenging each other" maybe the important point. If you are a normal dude it is easy being intimidated by a big ego. But if you are an A-player you can hold your ground?
Maybe it was even the case that engineers, who are focused on objective technical details/goals and have thick skins, dealt best with Jobs?
He did mellow with age, though. The Steve Jobs biography actually latched onto this as a key narrative element—a way to construct Jobs's personal arc—and I do believe it's genuine based on everything else I've read about the guy.
And it's notable that Jobs only really reached his zenith in these later years. The original Macintosh had a splashy launch, but sales began dwindling pretty quickly[1], and NeXT never had much commercial success before Apple bought them. My admiration of Jobs is really for the person he was in his last decade. He was a visionary long before that, of course, but ideas are relatively cheap, and Jobs couldn't execute.
Jobs was, to be sure, certainly still a demanding figure at the end of his life (and I would not have wanted to work for him), but I think Elon Musk and Jeff Bezos have him beat.
Even ignoring the specific movement, hopefully we all pressure our bosses to be less abusive (no abuse is acceptable).
I could never work for Steve Jobs because I wouldn’t have put up with his ridiculous behavior and would have walked.
Part of the situation that led to the MeToo movement was power, and Steve Jobs had a lot of power over people who worked for him.
This was something I knew since starting my career, and worked for the last ten years to make sure no one (other than governments) has so much power over me that I have to listen to them.
Isn't part of it that Steve was able to sell people on his vision though? So it's not just that he had power in the way that a judge has power or a school principal has power— those are powerful figures that you submit to because the alternative is punishment. Rather, he had power in the way that a beloved family member has power. People wanted to please him because they had bought into what the vision was and how their piece of the puzzle fit into making it a reality.
Was there abusive stuff going on there? Absolutely! And there's almost certainly some overlap here with other cases (actress submits to famous film executive because it's part of his "creative process"), but I don't know if the current/recent reckoning would do much to prevent a small, dedicated technical team from overworking themselves and tolerating abusive management practices in service of a new charismatic, visionary leader like Jobs apparently was.
Can you list some examples of these “lesser and lesser” offenses?
I’m a manager, I treat my employees with respect, and no one has ever complained about me abusing them. I’d like to know how “small” these claims are getting.
I can give you an example that I witnessed back when we were still in the office pre-COVID.
Someone was making copies at the copy machine. Another person made a joke comment about him running off copies of his resume. A harmless remark that's been made millions of times in thousands of offices for as long as copy machines have existed.
The next day the commenter got hauled into HR for "harassment."
That... is a very good example. Thanks for making it real.
At my last in-office role, I had employees (direct reports) give me similar comments if I happened to come into the office dressed particularly nicely. Certainly didn't feel like harassment!
When I’ve seen HR complaints in the past that seemed trivial, there was usually a history of interactions that resulted in the complaint, and the person filing has reached their limit.
That’s why HR is important - they need to determine if the complaint or history of complaints is a real issue or trivial.
I could imagine having to demo on Saturday and needing to work all day to incorporate feedback by Sunday would be called abusive, even at small startups, these days.
If that’s not something the employee agreed to up front and they aren’t compensated for it, it may very well be abuse of the employer-employee power dynamic.
Asking an employee to suddenly work all weekend when they don’t have the expectation and potentially aren’t in a situation to say no would certainly be considered an abuse of power (what if they miss their kid’s birthday).
I coach all of my employees that they own their time. I can’t ask them to work late or work more days, because I don’t own them.
It is my job as a leader to ensure that their time is protected, and I’ve pushed back on management multiple times when last minute changes were requested and my team would need to work more to fill that request. I put myself in the line of fire and say that I don’t have the capacity in my team to fulfill that without cutting work.
I ask my team to tell me if I ever overstep and they feel uncomfortable saying no when they really want to.
I also encourage all of my employees to interview outside the team/company so they know their worth and understand that they have the ability to leave if they ever feel our power dynamic is being abused and I don’t do anything to fix it.
Leaders can effectively manage teams and deliver on vision without abusing the employer-employee power dynamic. It makes leadership more difficult since you have less flexibility in the capacity of your team (capped at 40 hrs/week and can’t suddenly expand to 80 hrs/week), but it makes for better teams and happier people.
> Would his behavior have been as tolerated or celebrated if he were still around today?
Not sure I've seen many (any?) instances of his abusive behavior being celebrated in my 30 years of following him. Certainly some awe over how scary he was.
I've often thought it amazing that he was as successful as he was despite his terrible behavior.
His golden aura would have been dented for sure. Like a lot of abusive people, he did very well when he could control the flow of information. But social media is undermining that.
However, I think it depends a lot on where in his career arc this transition happened. If he had been caught out early on, it could well have kept him from rising. Imagine the Twitter furor if a rising exec got caught cheating his business partner, for example. [1] Of course, it could have gone the other way; his conscious manipulation of his image [2] could have led him to be less abusive, or at least better at concealing it.
But if it came later, once he was head of Apple, I doubt it would have mattered much. He was already notoriously an asshole. [3] People will accept a lot as long as the money keeps rolling in and the asshole seems irreplaceable.
#MeToo concerns sexual abuse in the workplace. I'm not sure why you thought it was necessary to cite that as your milestone marker, and then back out to talking about abuse in general.
#MeToo also had that disturbing element of liberal, feminist icons being the sexual abusers all along. (Harvey Weinstein)
I think the parent is using #MeToo as a catch-all for intolerance of any alleged abuse of power and cancel culture in general. And as others have mentioned Jeff Bezos and Elon Musk are doing fine, so no Steve Jobs would have probably been fine in the current time.
Some people thought so because he made a lot of movies about women and supposedly helped some very famous women with their careers. It was even used in part of his public perception campaign that he deserved some credit for helping these women.
I can see in the article you linked that Weinstein is using said claims in his public perception campaign, but I can't find any other resources about the "some people" part. Maybe I'm not in the know about the film industry, but I'm not seeing a lot of consensus that he was a paragon of feminism. Or, as far as high-profile feminists go, I wouldn't suspect him.
There definitely would have been pressure to change. I highly doubt he would ever have been cancelled, though. Nothing he did ever rose to anywhere near that kind of level, except possibly for aspects of his family life. Jobs was a brilliant product guy, but verbal abuse never helps teams become more productive. All that he did was in spite of his temper.
I would, uh, say Musk is not doing fine. Among other things, he got his company investigated by the SEC for basically no reason at all. Jobs had his moments, but when he went crazy, he didn't go nearly as crazy as Musk, at least not in public.
To be fair, Jobs also didn't have a Twitter account. I don't know why, but that seems to do weird things to people.
Musk is probably worse than Jobs, and he has a fairly well-known talent-retention problem ascribed directly to his personal behaviour, but both Tesla and SpaceX are doing exceptionally well.
Pavel Durov tho... he's visionary but his expectations became too unrealistic lately. I wonder what happens to Telegram now that the SEC stopped that ICO.
The real question isn't if he could have lasted or had to adapt. The real question is if the next Steve Jobs will be able to do the things the old one did, while avoiding post-progressive pitfalls.
Elon is different. People called and still call Jobs a jerk (as James Gosling did in his interview with Lex Fridman). Elon shows erratic behavior from time to time, but I do not think people would call him a jerk.
I think his transphobic and COVID-downplaying tweets got him called a jerk. Oh, and the pedophile accusation against the guy who rescued the kids from the cave. I think he was a jerk for that one.
>> One thing that was unusual is that all the technical people there understood all aspects of the machine. Software people could talk about ASICs and CPU instructions, and the hardware people understood the software stack. Every aspect of what it takes to make a computer work was represented in one building: analog hardware, chip design, motherboard design, compiler design (objective C), loader, operating system, windowing system, application layer, and applications. Where other companies had engineering teams, NeXT would have a single individual.
This is in stark contrast to most of today's companies, where you have front-end engineers who don't know anything about the backend they are interacting with, backend engineers who don't care about the frontend they are serving data to, database engineers who care about neither, etc.
And that's just software. The hardware might as well be a black box for the vast majority of software engineers working at your average software company today.
Do you think that engineers designing the plumbing system of the F1 rocket engine knew pretty much anything about "compiler design" or "motherboard design"?
What you are in fact observing is a human system's tendency to adapt to growing complexity. Human systems adapt to growing complexity by specializing their members in particular skills (see https://en.wikipedia.org/wiki/Cognitive_specialization).
It's just a fact that back in the NeXT days the computer was not as complex; understanding all of it was still just about achievable by one human. Eventually the number of humans required exceeds 1.
For the first Apple computer it was ~1 (Wozniak); for Apollo it was >100.
Cognitive specialization is an aspect of all living things, but it is very apparent in the human species. Computers got more complex in just the same fashion as farming went from a farmer, a bull, and a blacksmith to gigantic conglomerates making the fertilizer, tractors, watering systems, etc., involving >10,000 humans.
Partly ad tech. It changed the reward profile away from inventing cool new stuff towards lowest-common-denominator monetisation.
The real change was the change in the culture of computer use from original creation to consumption and distribution of certain limited kinds of creation - which are mostly imitative, nostalgic, and either backwards- or (at best) sideways-looking rather than genuinely original.
Real invention is now actively disfavoured. Google did a fair amount in the 00s but has slowly abandoned most of it, Amazon does a bit of blue sky but is mostly focussed on consumerism, Musk's idea of blue sky is straight out of a 1950s Tom Swift novel, and Facebook and Twitter are both hopeless. Netflix is cable TV done right - finally. But it's still cable TV.
There are some exceptions at Apple, which still has a kind of legacy tradition of doing cool new stuff (see also, M1) but even that is a mix of invention for the sake of it and strategic lock-in as a goal.
The result is a landscape full of development geared to comfortable suburban consumerism and associated corporate bureaucracy. There's very little interest in game changing technical development for the sake of it - which was more or less what NeXT was about. And there's even less interest in computing as subversion and empowerment, which was - believe it or not - a big interest in the 70s.
FOSS doesn't change this. (It likes to believe it does, but practically it really doesn't.)
Quantum computing and AI may be on the cusp - but even if they do something interesting they're going to be coopted by ad tech as soon as the paint dries.
So it's not about technical scope so much as imagination failure. The real loss is the loss of imagination - something that tech and media have both done a lot of damage to over the last couple of decades.
I don't think we did; there's a bit of selection bias at play for NeXT where they had a large network of stars to choose from and a reputation that would attract a large pool of other people worth choosing. It's just that a lot more people are involved in the industry now so there's a lot more entry-level/grunt work to be had.
I think part of the problem is that you're comparing to a fairly different industry. Companies that employ "front-end" and "back-end" engineers are very different from NeXT, which built products from the ASIC up to the application layer.
If you work at a hardware company, even in a role that's very far from hardware, you become aware of this stuff because it affects you. Even if you don't understand ASICs at all, you still know that, oh, such-and-such process node has this issue, because it impacted our schedule and somebody told me about it at a lunch table. And you may not even know how to solder, but you can say, oh, this needs rework, a 0-ohm resistor at point such-and-such to make the display work, because you need to know it to go to the lab and have the work done.
Really the thing I understood the least working at a company like that was actual productization. You hear about something you worked on a year and a half ago being a tablet, or an embedded device, and you'd be like, oh, that old thing is just being released now? And it's in that form factor? huh.
By getting it right, which led to an explosion in the market, leaving us in a position now where there is more work to be done than the people who understand systems from top to bottom can handle alone. If those pioneering efforts had failed, the tech industry would now be insignificant and those superstars would be struggling to find work, never mind those who understand or care less.
> NeXT was like graduate school, bringing together a high concentration of some of the brightest and most innovative technical minds
This line really interests me. As someone graduating pretty soon: are there tech companies out there that still have this culture? Everything seems marketing/product focused today. Besides going to graduate school, does anyone here feel like they are at a company like this?
I think within big tech companies (Google/Apple/MS/etc.) you can find teams that have this kind of culture. IMO any team that does serious system programming (Compilers/OS/Libraries/etc.) should have it.
Chatting with national lab folks gave me a feel like this. Especially LANL.
I suppose Bell Labs might feel the same. I have a prof who goes to work there in the summers, occasionally taking a couple of undergrads with him. He is one of the best teachers.
Yes, research labs like CERN have that culture. Startups, particularly those sharing incubator space, are often friendly. (I'm at a startup now, and we play table football every lunchtime).
Oh that brings back some memories. An old manager of mine at Apple had moved into Apple from NeXT with the WebObjects team. We had a lot of black hardware and tons of NeXT docs, etc. and ran the Austin corp NetInfo server. He worked with Jobs for WebObjects demo/keynotes. He told some stories about Jobs, mostly about things he would throw when demos went badly.
I still have a lot of NeXT swag that was eventually given away and have a color turbo slab gathering dust.
I'll try to follow up this eve. I have a S3 Project Team magic 8 ball for my work on the Mac OS X Server 1.0 release (sadly the fluid somehow leaked/dried, but I still cherish it), a lot of magazines and developer docs, and I think some stickers as well as other branded materials.
This is a 3 hour interview with Blaine Garst by the Computer History Museum. Blaine is one of the people quoted in the main link. I listened to it the last time Blaine was mentioned here and it is pretty interesting.
> The Web was happenin’ (invented as you may recall on a NeXT workstation) and we lost
This is true but not entirely.
I have recently, for no productive reason, become interested in the computers built by different nations early in the computer revolution.
There was so much competition, so many ideas, so many opportunities.
Anyways I live in Norway now and Norway had a company called Norsk Data that I had never heard of until about a year ago.
They designed their own hardware and operating system.
For a while they made a "supercomputer".
Some highlights:
The NORD-5, the world's first 32-bit minicomputer - beating the VAX, which is often claimed to be first, by 6 years.
The ND-570/CX, a 32-bit supermini released in 1983, rated at 7.1 Whetstone MIPS.
Their greatest claim to fame, aside from (but closely related to) what's listed above: they delivered the computers for the CERN collider back when that was starting up.
They also delivered computers to help create F16 flight simulators.
For a short period of time Norway had best-in-class computers. I had no idea.
To the relevant part (sorry about the long wait):
Norsk Data claims:
> "The World Wide Web originated when Tim Berners-Lee wrote the ENQUIRE program in Pascal on a Norsk Data NORD-10 running under SINTRAN III at CERN.[4] They also used ND-NOTIS, that was based on SGML, and emailed with NOTIS-MAIL, using tcp/ip, coded in HTML."
I have only seen ND machines in the museum and on the web. I wasn't around at the time, so this is based on Norwegian documents for the most part.
I do know that there were Nord servers used in the Norwegian military up until at least 1995. The ginormous task of creating the replacement system took a very, very long time.
The very early origins of Norsk Data were at FFI, a defence research agency (kind of like (D)ARPA).
Norsk Data were one of the big computer companies in the 80s.
Together with Bull, Wang, Datasaab, NCR, and a few more, they pretty much defined minicomputers. They dominated the computing space for about a decade. Then they were undercut by the UNIX companies, which in turn were undercut about a decade later as microcomputers, PC hardware, and things like Linux took over.
In their short run, minicomputers were commercially successful and quickly became entrenched, but neither their business models nor their expensive hardware could compete with the later generations of computers. All of the above companies went bankrupt or were bought out and stripped during the early 90s.
Computing hardware generations seem to get a longer lifespan now, though, as we are still stuck with PC-style hardware. It has managed to evolve and stay relevant, probably in large part because of Intel.
>"We lost our custom hardware. The any workstation you want sprints (solaris, hp, alpha) didn’t pan out. The Microsoft tax (later judged monopolistic, too late, as Ray Noorda of Novell confessed to me as had happened to him) killed our PC business. We ditched the OS and ran on Windows. We sold our source code to Sun to make a multiplatform OpenStep. No cigars. (They made Java out of it using many of our/my ideas)
This is interesting. This is the first I've heard about a link between NeXT and Java. Does anyone have any further information about this? I thought James Gosling developed Java from a language called Oak. That's the earliest origin story I've heard until I read this post.
Java interfaces are based on Objective-C protocols, which were one of Garst's contributions to the language. But I don't think you could say they "made Java out of it".
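For anyone who hasn't used Objective-C, the parallel is easy to see in code. Below is a minimal sketch of the idea: the Objective-C in the comments mirrors the real NSCopying protocol, while the Java side is an invented illustration, not a JDK type.

    // The kind of contract an Objective-C protocol expresses, written as the
    // Java interface it resembles.
    //
    // Objective-C (roughly the real NSCopying protocol):
    //   @protocol NSCopying
    //   - (id)copyWithZone:(NSZone *)zone;
    //   @end
    //
    // A Java rendering of the same idea (hypothetical, not a JDK type):
    interface Copying {
        Object copyWithZone(Object zone); // a class "conforms" by implementing this
    }

    class Document implements Copying {
        private final String title;

        Document(String title) { this.title = title; }

        @Override
        public Object copyWithZone(Object zone) {
            // zone is ignored; it exists only to mirror the Objective-C signature
            return new Document(title);
        }
    }

Both declare behaviour a class promises to provide, without inheriting any implementation, which is the resemblance being pointed at.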
I seem to remember hearing that there are portions of the original JDK whose API seemed quite inspired by elements in the OpenStep (now Cocoa) API. Not UI pieces, but some utility classes and the like. But I can't think of any specific examples, and I'm suspecting this is more bluster than anything.
GNUstep reimplements the Cocoa APIs, not the OS. OPENSTEP the OS (a.k.a. NeXTSTEP with the new OpenStep APIs) and OpenStep the API spec that became Cocoa are frustratingly named.
Random note: I, for no reason, went looking at WebObjects, which I understand hailed from NeXT. I liked the idea of entity-based systems.
Discovered that Apache Tapestry, which I used a decade ago, was inspired by WebObjects. It was a very interesting, fairly seamless, backend-centric web development experience, and it worked quite well; I say this as someone who loves JS and thick-client architectures. I didn't see a ton of objects that felt like generic "web objects", though. In WebObjects it seems there were primarily objects, plus different bindings to re-expose and/or convey updates between the object and the various front ends it might show up on. I'm probably over-glamorizing how shared, how "web", the objects in WebObjects really were.
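For what it's worth, here's a toy sketch of that "one object, several bindings that push updates to different front ends" shape. Everything here is hypothetical; it is not WebObjects' (or Tapestry's) actual API, just the general pattern.

    // Hypothetical binding sketch: one observable object, multiple "front ends"
    // that bind to it and get pushed the current value on every change.
    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Consumer;

    class Observable<T> {
        private T value;
        private final List<Consumer<T>> bindings = new ArrayList<>();

        Observable(T initial) { this.value = initial; }

        // A front end "binds" by registering a callback; it immediately sees
        // the current value and then every subsequent update.
        void bind(Consumer<T> frontEnd) {
            bindings.add(frontEnd);
            frontEnd.accept(value);
        }

        void set(T newValue) {
            this.value = newValue;
            bindings.forEach(b -> b.accept(newValue));
        }
    }

    public class BindingDemo {
        public static void main(String[] args) {
            Observable<String> title = new Observable<>("Untitled");
            title.bind(v -> System.out.println("web view shows: " + v));
            title.bind(v -> System.out.println("desktop view shows: " + v));
            title.set("Quarterly Report"); // both "front ends" get the update
        }
    }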
If only we could get NeXT back... or Apple would build something as well designed, clean and productive. Give us a proper streamlined UI, not this over-spaced baby blur and give us proper APIs for everything. Security is good, but it needs to be handled in a more consistent way, with less trouble for developers.
> The Microsoft tax (later judged monopolistic, too late, as Ray Noorda of Novell confessed to me as had happened to him) killed our PC business.
Do as I say, not as I do.
Apple is every bit as monopolistic as old Microsoft. They need to be forced to open iOS. They sell generic computers you can't run freely, and they're screwing over small businesses that just want to write and sell software.
Prior to iOS, you distributed your program. Now you go through the gatekeeper, follow mundane rules, and pay an absurd tax.
Apple is in no way like the Microsoft of old. Microsoft did not build generic computers of their own; other companies like Gateway, Compaq, HP, Dell, or Packard Bell did. The Microsoft tax was the agreement between Microsoft and those companies to install Windows on any and all computers they made and pay licenses for all of them. Since Windows had a 95%+ OS market share, every computer manufacturer needed an agreement with Microsoft, and no generic PC could be bought without a Windows license.
Apple only forces their software on hardware they manufacture themselves. If you want other software, just buy hardware made by a different company. There are still many of them.
So go buy a generic computer that hasn't had Apple's fingers on it. Contrast that to the 90s, when finding a computer without Windows and IE on it was much more difficult. And if you did find one, you'd still pay for a Windows license because that's the kind of deals Microsoft bludgeoned OEMs with.
> Prior to iOS, you distributed your program.
And now Apple doesn't allow one to write Android programs? I missed that one.
Your comparison is poor. Starting with the fact that a minority of devices run Apple operating systems.
Apple taught people that apps should only cost $1 and have free updates for life, meanwhile they reaped profits off developers. That isn't healthy, but Apple doesn't care.
I get that you like your Apple device, but this company is destroying our freedoms, making it harder to run a profitable business, and taking advantage of their market position and customer base.
There isn't a lot of room for competition to grow. Their draconian behavior is staunchly anti-ownership. They have a ball and chain around our ankle.
If iPhone had 5-15% market share, you might have a point. But it doesn't. We're running out of freedoms and breathing room. The giants are taking everything away.
Stop worshiping a dumbass phone and the company "protecting you" by taking away everyone's freedoms. It's a stupid little computer - worth far less than our liberty to write code, distribute it, and reuse/upgrade the things we own as we see fit.
Apple has the advantage in that their particular computer is wildly popular and widely used. All it takes is for the DOJ to come and tell them to lighten up - and that's exactly what we need.
A big ol' [citation needed] on that one, because from where I stand The Market(tm) taught people that.
> Stop worshiping a dumbass phone
You would do well to watch your tone. I'm merely pointing out that the Microsoft of the 90s, who is a convicted monopolist, is nothing like the Apple of 2020 that you're complaining about. Your follow-up doesn't seem to support your point, but rather just further complains about Apple.
> Apple taught people that apps should only cost $1 and have free updates for life.
Taking a quick glance at my iPad’s Home Screen, I see PCalc, Omnifocus, IA Writer, Overcast, Drafts, Carrot Weather, Soulver, and some more. A mixture of subscription and non-subscription apps. All very high quality and all of which I pay considerably more than one dollar for.
No it's not at all as monopolistic as MS was. MS was the alpha and omega of software at one point, you couldn't get around them if you wanted to sell to consumers or to businesses.
This is not true of Apple, one can build a business and completely ignore them. Google on the other hand, good luck with that.
> This is not true of Apple, one can build a business and completely ignore them.
These days, there is a very, very large class of "potential businesses" that are only realistically realizable as smartphone apps. In my country, iPhones that can only run Apple-approved apps account for over half of the installed base for smartphones, as far as I know.
Making and selling a better smartphone is not as simple as making a better hammer, and there was never such a thing as a hammer that would only work with nails approved by the hammer's manufacturer. This is really a new situation with no pre-tech analogies, and we cannot rely on pre-tech laws to cover it. And it's clear that consumers and small businesses are not being protected very well, if at all, from Apple. I should not be forced to be spied on by Apple or Google to park my car, but that is the situation we are in, since there aren't meters everywhere in my city, only an app. A lot of supermarkets are developing their own apps for scan-and-pay; how long before my only choice is between Apple and Google when I want to get food? There are more scenarios like this by the day, and in my mind there is no question we will need regulation to address it.
> These days, there is a very, very large class of "potential businesses" that are only realistically realizable as smartphone apps. In my country, iPhones that can only run Apple-approved apps account for over half of the installed base for smartphones, as far as I know.
That doesn't really follow. You've gone from "you can't sell software/hardware without Windows compatibility" to "in the Apple half of the mobile market, you can only do native apps with Apple's blessing". Apple, even in the phone space, does not have anything like the utter market dominance that Microsoft had; Android is in fact a real option. Likewise, although it is more limited, you can in fact do webapps even on an iPhone without Apple's approval.