To be fair, it's pretty expensive to break them into chunks and package them individually, and they're turned-pin ones by the looks of it. These aren't massive-production-run bags of them.
Yes, but Amazon's schedule involves them having advance, non-public knowledge of major security flaws. In some ways, you've got a false sense of security.
Everything else is yours too, like getting an ROI out of the capital you're spending. It's nice you have that kind of money lying around to spend on iron instead of advertising (as an example), but not everyone does.
Or just buy an older ThinkPad X201 / X220 / T410 with 1440x900 display and get a new 9-cell battery.
The things are really cheap, have a proper keyboard, are bomb proof, every part is replaceable for minimal cost, they are 100% supported by all operating systems, have excellent docking support and you can just sling them in your bag. Mine lives down the back of the sofa cushions when not in use.
I knackered the headphone port on mine from overuse. 5 mins on eBay, £7 spent, new board installed in 5 mins with a Phillips screwdriver and the service manual.
For everyday tasks, I use an i5 X201 with 8GB of RAM and a Samsung 840 Pro, and it's no different from the stacked HP Z620 I have in the office. If I need more power, I use that remotely.
Totally awesome machine and I paid virtually nothing for it.
PC tech is moving so slowly now that, if you ask me, it's a better investment to buy two older machines so you have a backup unit than to buy one new one.
I've always been curious--what's the use case for the backlit keyboard?
On the surface, backlighting seems largely aesthetic to me; I have muscle memory of my keyboard layout and don't need to reference the keys at all in the dark. (A huge tactile cue is the TrackPoint in the keyboard.) I understand there are lots of variables: not everyone uses their machine as frequently, or may not have a keyboard amenable to this.
Does anyone here feel backlit keyboards are essential? Why or why not? What are the consensus rationales?
Anecdote: I had a 2011 MBP with a backlit keyboard and never used it once. In fact, the machine ended up with Windows on it because MacOS made me want to smash the thing (despite being a Unix guy), and the Boot Camp drivers didn't switch the backlight off reliably, which annoyed the crap out of me.
If you're sitting in a place dark enough to need it, staring at the contrast between the screen and the darkness isn't good for you, so turn the light on.
I never understood the turning-the-lights-on bit. I find bright ambient environments hurt my eyes more than dark ones when I'm trying to concentrate on the screen.
Then again, most websites I use regularly have been re-skinned with dark themes, since white hurts my eyes regardless of the ambient environment.
I tend to use the backlight on my keyboards to align my fingers and when searching for symbols.
I never felt the need to memorize where all the symbols are located, since they change locations depending on keyboard type (language) anyway and I can't be bothered to.
Whilst not quite PDP11/VAX territory (I do love VAXstations, though!), the killer for me was that I bought a low-end Sun Ultra 30 in 1999, just before they EOL'ed them, and spent 3 years slowly upgrading it. Spent about 4000 GBP on it in total.
It still felt faster back then than the i5 X201 ThinkPad on my lap does today. Everything was smooth as butter, even though it was Solaris 2.6.
This is not hard work. I'm not sure exactly what was being done that made this take more than an hour.
On our kit, which is pretty hefty, I'd schedule a 30-minute window for a 150GB ETL, even with heavy transformations.
When we moved our stuff from Mongo to SQL Server, it took an entire 6 minutes to drag 30GB over, including SQL full text. That was a 16-core Xeon with 64GB of RAM and 15K SAS RAID 10, so not a big machine either.
This. I've had several smartphone devices over the years and not one has been bent in my pocket, even when I sit on it. I don't expect it to happen, even on a premium device.
I would suspect that anyone whose email did not match that regex would have such a miserable time generally, getting it rejected as invalid left, right and centre, that they would just cave in and get a simpler one that did.
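Just to illustrate the effect (the pattern below is a hypothetical over-strict one of the kind many signup forms use, not the regex under discussion), here's a quick C sketch of how perfectly valid addresses get bounced:

    #include <regex.h>
    #include <stdio.h>

    /* Hypothetical over-strict pattern, typical of what many signup forms use.
       Not the regex under discussion. */
    #define NAIVE_EMAIL "^[A-Za-z0-9._-]+@[A-Za-z0-9.-]+\\.[A-Za-z]{2,4}$"

    static int naive_match(const char *addr)
    {
        regex_t re;
        int ok = 0;
        if (regcomp(&re, NAIVE_EMAIL, REG_EXTENDED | REG_NOSUB) == 0) {
            ok = (regexec(&re, addr, 0, NULL, 0) == 0);
            regfree(&re);
        }
        return ok;
    }

    int main(void)
    {
        const char *addrs[] = {
            "john.doe@example.com",    /* accepted */
            "user+tag@example.com",    /* rejected: '+' not in the character class */
            "o'brien@example.ie",      /* rejected: apostrophe in local part */
            "someone@example.museum",  /* rejected: TLD longer than 4 characters */
        };
        for (size_t i = 0; i < sizeof addrs / sizeof addrs[0]; i++)
            printf("%-25s %s\n", addrs[i],
                   naive_match(addrs[i]) ? "accepted" : "rejected as invalid");
        return 0;
    }

Anything with a '+', an apostrophe or a longer TLD gets flagged, so the path of least resistance is a plain, boring address that every form accepts.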
Not the only one. Someone set up NHibernate on a project I was working on to run logging in DEBUG mode and dump it into a log file capped at 100k. Pages were taking 4-5 seconds to render, and it wasn't obvious the logs were being written because the file was capped at 100k.
Wouldn't most storage types, including regular HDDs and optical media, fail this test though? I'm not talking about complete data loss, but some data would be corrupted after that much time.
Magnetic media (hard drives, tape) will essentially retain data indefinitely unless exposed to sufficiently strong magnetic fields or heated past the Curie point, both of which are unlikely scenarios for long-term storage. Even in the case of a fire that destroys the external components of an HDD, if the platters didn't get hot enough, the data is still there.
Flash-based memory is different - unlike magnetic media, which can be thought of as bistable, flash is inherently unstable (monostable); the erased state of a cell is lower energy, so the electrons stored in a programmed one are "under pressure" and, due to tunneling effects, slowly leak out over time.
The consequence of this is that magnetic media will continue to store information long after it's obsolete; I'm almost willing to bet that the data on a modern HDD will still be there on the platters in 100+ years, even if the rest of the drive becomes inoperable. Ditto for optical media such as pressed CDs - in that case the bits are manifested physically, and unless the medium is degraded to the point where the bits are no longer distinguishable, the data stays (theoretically, even a CD whose reflective layer has degraded is still readable via SEM or other physical means, since the data is physically pressed into it.) On the other hand, flash will slowly and irreversibly erase itself over long periods of time, as each cell returns to its non-programmed stable state.
This is not exactly true: the magnetic fields on an HDD migrate around the disk over time and eventually become unreadable by the drive. In theory the data remains recoverable for significantly longer than that, but it's not 'stable'. While historically not much of a problem, it's becoming a larger issue as HDDs keep increasing in capacity.
HDDs actually refresh data internally to avoid this issue, so they're much better as 'hot' storage. Tape is designed to avoid most of these issues and is much better for long-term storage.
"Magnetic media – such as floppy disks and magnetic tapes – may experience data decay as bits lose their magnetic orientation. Periodic refreshing by rewriting the data can alleviate this problem. " http://en.wikipedia.org/wiki/Data_degradation
Decent quality hard disks and DLTs will probably be fine. I've had DLTs survive 100°C for over an hour inside a fire safe during a fire, and some old Seagate Cheetah 10K U160 SCSI disks in an external SCSI case with a dead fan running at 80°C for a month quite happily. The latter were actually running and operational, which is remarkable.
Some optical media will fail. From experience, it doesn't even have to be kept at 75°C.
Flash will almost certainly fail (leakage increases with temperature). I've had dead CF cards, USB sticks, SD cards, the lot and corruption after only a couple of years stored in ideal conditions. Then there's the old Sun PROM crapfest to consider as well...
These are all anecdotes, but there is data out there to support this as well.
"I've had dead CF cards, USB sticks, SD cards, the lot and corruption after only a couple of years stored in ideal conditions. Then there's the old Sun PROM crapfest to consider as well."
The story goes back much further with solid state if you talk to "retrocomputing enthusiasts": unfortunately, EPROMs reading all 1s after a while is depressingly common. You only need a single bit error, of course, for software failure.
Another interesting point to bring up: it is very well known in the retro community that the details of EPROM programming technique, and the certification (or lack thereof) of EPROM programmers, have a dramatic effect on burn lifetime, like multiple orders of magnitude of difference. Anyone can make an EPROM burner that verifies an hour later. It's much harder to make one that still verifies 10 years later.
This affects more than stereotypical retrocomputing because of embedded devices. Plenty of PBXes, machine tool controllers, scientific instruments and classic video arcade machines get scrapped because the EPROMs lost their minds.
Very good points there. I worked for an aerospace and defence company for a bit as an electrical engineer. Our software was always read into RAM and checksummed, the RAM was write-protected via a register, and only then was the code executed. The bootloader was in mask ROM. That was all just to work around the possibility of bit flips in EPROMs.
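Roughly, the flow looked like the sketch below. Everything concrete in it (addresses, the write-protect register, the additive checksum) is made up purely for illustration; the point is just the copy, verify, lock, then run sequence:

    #include <stdint.h>
    #include <stddef.h>

    /* All addresses, register names and the checksum scheme here are
       hypothetical, only to illustrate the copy -> verify -> lock -> run flow. */
    #define EPROM_IMAGE  ((const uint8_t *)0x00100000u)
    #define IMAGE_SIZE   (64u * 1024u)
    #define EPROM_SUM    (*(const uint32_t *)(EPROM_IMAGE + IMAGE_SIZE))
    #define CODE_RAM     ((uint8_t *)0x20000000u)
    #define RAM_WP_REG   (*(volatile uint32_t *)0x40001000u)  /* write-protect latch */
    #define RAM_WP_LOCK  0x1u

    typedef void (*entry_fn)(void);
    #define APP_ENTRY    ((entry_fn)0x20000000u)

    static uint32_t checksum(const uint8_t *p, size_t n)
    {
        uint32_t sum = 0;  /* simple additive checksum for illustration;
                              a real system would more likely use a CRC */
        for (size_t i = 0; i < n; i++)
            sum += p[i];
        return sum;
    }

    void boot(void)  /* this routine lives in mask ROM */
    {
        /* 1. Copy the application image from EPROM into RAM. */
        for (size_t i = 0; i < IMAGE_SIZE; i++)
            CODE_RAM[i] = EPROM_IMAGE[i];

        /* 2. Verify it: a bit flip in the EPROM is caught here, before execution. */
        if (checksum(CODE_RAM, IMAGE_SIZE) != EPROM_SUM)
            for (;;) { }  /* halt / raise a fault rather than run bad code */

        /* 3. Write-protect the code RAM so the verified image can't be modified. */
        RAM_WP_REG = RAM_WP_LOCK;

        /* 4. Only then hand control to the application. */
        APP_ENTRY();
    }

The nice property of doing the check on the RAM copy rather than the EPROM itself is that what you verify is exactly what you execute.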