movq

Wherein I Move a Lot of Words Around

Time Machine Quotas

Quick tip for Time Machine backups: Mac OS X Server ("Server.app" these days) can set up a quota for Time Machine backups that is per-machine (rather than per-user). This trick works on any share, actually, since it's a root-owned property list in the share directory. You'll want Mavericks or later as a client for this to work. To set it up, cd into the root of the Time Machine share and create the plist with (for a 500GB example):

/usr/libexec/PlistBuddy -c 'Add :GlobalQuota integer 500000000000' .com.apple.TimeMachine.quota.plist

On the server, make that owned by root:wheel and chmod 644 et voila! You have quotas. No need for "real" quotas, LVM, or other fun. Works over SMB and AFP.
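To make the prose above concrete, the server-side ownership and permission bits look something like this (a minimal sketch, run from the root of the share):

# make the quota plist root-owned and world-readable
sudo chown root:wheel .com.apple.TimeMachine.quota.plist
sudo chmod 644 .com.apple.TimeMachine.quota.plist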

The Forgotten

After watching yesterday's Apple Event and reading around a bit at the reactions, I've become concerned for the future of the Mac, at least in the hands of the current leadership at Apple.

For a long time we — creatives, power users, and developers, the "Pro" in the product names — felt the fear that Apple's success with iOS would manifest itself as a locked-down Mac with candy icons on the screen. While that does not appear to have come to pass, something far more damaging has: Apple completely forgot why people use Macs.

There's a Tim quote that's seen a resurgence in the past day:

“I think if you’re looking at a PC, why would you buy a PC anymore? No really, why would you buy one?”

Tim Cook, talking about the iPad Pro (from Charged Tech)

That one quote speaks volumes. Perhaps it wasn't meant to, in fairness. Perhaps it was off-the-cuff and meant only to say the iPad Pro was a lovely little tablet and it solved the needs of all the casual users in the world. But that the thought was there to be spoken also says that it's been said a lot internally, in one way or another.

It seems to me that Apple's distinction between the Mac and iOS is purely about how we interact with the device — a chair and keyboard/pointer versus lounging and casual poking around — instead of recognizing that there's a fundamental reason why the interaction patterns differ. They appear to have only a surface-level understanding of why people use Macs instead of the deep understanding of their customer base that was once apparent in the design.

Four years. It's been four years since Apple last significantly updated the MacBook Pro. I own that machine and I'm typing on it now (Early 2012 R-MBP). What's really shocking about this machine is that it still stands up to machines today. It is powerful and progressive and a statement about future-proofing portable computers. It's thin, but not so thin it does nothing. It's powerful, but not so powerful it can't last most of the day on battery. It's light, but not so light that it's missing significant features.

However, even it missed some marks. As a part of Apple's new mentality of serially murdering ports, the ethernet port was removed from the computer. At the time I recall hearing someone in the presentation say: "When was the last time you used wired networking? Wireless is the future." Wireless is the future, yes. The problem with the future is that it's not the present, and removing a port from a single product will not change the world overnight. Corporations will not change their massive network deployments and port-level security. Homes will not suddenly be covered in wireless in every corner. Hotels will not suddenly have reliable and secure wireless networks. Even today I wind up using wired ethernet daily in some fashion.

It was a mistake, and not the only one. That model also hard-wired the RAM to the motherboard, switched to PCIe storage, and sealed the computer. The previous model was delightfully expandable with just a small tug on a lever to expose the battery, 2.5" SATA bay, and the SO-DIMMs used for RAM. Gone. No upgrades. Buy today what you'll want in five years and hope the battery lasts. The iMac and Mac Mini were hit with similar reductions in expandability (though they kept their ethernet ports).

In short order new MacBooks lost just about every other port on the machine and they even toyed with the idea of just one port, merging power and connectivity. (Think about that one, because they did it again with the iPhone 7 recently.) They removed all connectivity from the MacBook when it was being used at a desk and then all ability to charge when it was in active use with an external device that needed power. If I were in the room in a position of power when that was suggested I'd have one more junior whatever on staff in short order. It's a very rare lifestyle that never, ever needs to charge the machine and connect something to it at the same time. Of course, there was a solution to it. You guessed it, a dongle that accepted power and exposed a USB-C port. The same idea they used on the iPhone 7. People hate it.

Now we come to today. Four years after the previous mostly successful redesign (I'm still using my Thunderbolt-Ethernet dongle). What got announced? All ports are going away and being replaced with a connectivity standard that very few first-parties are actually using. For just about everything you need to do with your computer, you're going to have to live a life of dongles. Whatever port you were using and relying on is dead. No peripheral you currently own can be attached without a dongle.

Do you use an external display with Mini-DisplayPort? Dongle.

Do you use an external display, projector, or TV with HDMI? Dongle.

How about an external hard drive, thumb drive, keyboard, mouse? Dongle. Dongle. Dongle. Dongle.

Hey, do you have an iPhone or iPad? Dongle! Or purchase the USB-C to Lightning cable that nothing comes with right now. $25, please.

You know, that's enough dongles that maybe you just want one super-dongle to rule them all. Well, those exist as well.

(By the way, none of those support power delivery to the tune of 87W, so they can't charge the new MacBook Pro at full speed. The OWC one comes close at 80W, but is still a bit shy. I would not pre-order one of those with your new MacBook Pro or you'll be very disappointed and have to plug in two USB-C cables.)

What this means, dear reader, is that if you are one of The Forgotten then you have to drop another $200-400 to purchase all the ports that Apple removed. It's like system building with expensive and wobbly LEGO bricks, and has all the appeal of that phrase.

If this were a situation where you'd buy one or two of them and be done with it, it might be okay. Might. But this isn't that case. You'll need a USB-A adapter (or three), a video adapter of some kind (unless you drop $2K on the 5K LG display), perhaps a wired ethernet adapter (always have one to be safe), and an SD card reader if you're a photog (BTW, get another USB-A adapter and leave it plugged in to that thing). By the way, if you ever have to load photos from a card, save them to an external disk, preview them on an external display, and then upload them over a wired connection (not too far from a real "Pro" workflow) then you aren't charging your Mac while you do this as you just used all four ports for dongles.

Apple thinks their Pro users are Pro Video users. Users who use Final Cut Pro (though many/most have moved away after the Final Cut Pro X fiasco) and need desktop RAIDs and large displays … and that's it, save, perhaps, an audio user or two. Folks like me who have been using computers for many decades and connect all kinds of things to them are generally left out.

However even common office workers are left out to a degree (ethernet! video!). Apple wants us to move to a dock-and-run lifestyle but they are unwilling to make the dock or make it easier to run. They used to make a dock but then they killed it with no replacement, instead pointing users to a third-party display. Even then, another $2K for those missing ports is obscene. The reason, of course, is that they bundled them with a screen.

Don't do that. Don't ever do that. If you see that, don't ever buy that. You've just tied the lifespan of your display to the lifespan of the specific ports it supports. If 10G Ethernet somehow becomes a household standard then you're out of luck. USB 4? Nope. Thunderbolt 4? You're stuck, even if that display is still wonderful and otherwise useful. (I'm saying this as I look at the disconnected Thunderbolt Display sitting on my floor because it's utterly useless on anything except my Mac. I use three desktop platforms and have limited desk space so everyone shares one display. HDMI KVM FTW. At least, until yesterday.)

But it's thinner. It's using Skylake (not Kaby Lake, mind you). It's powerful and wonderful and does magical OLEDy things. We can forgive them these slights because they did them in pursuit of thin and sexy, right?

Except that thin and sexy has been done without removing ports, fully utilizing Thunderbolt 3 and USB 3, and even incorporating a nearly desktop-class GPU. Have you seen the Razer Blade Pro yet? You should, especially if you're OS-flexible. Even the Razer Blade Stealth managed to pull it off better than the MacBook Pro 13" did (it still has USB-C, USB-A, and HDMI). Both include high-resolution wide-color displays (different gamut, but still wide-color). Both are actual touchscreen computers as well. They didn't even have to murder the Escape or Function keys to get it. (I'm in the CLI all day, so that hurt, but I know I'm a rarity on that one.)

They made a reference to the original PowerBook in the show yesterday. It actually made me sad to see it, because the PowerBook in general was so much more of a computer than the toys they sell today. It was so much more expandable and versatile — especially during the Wall Street and Pismo years. Need two batteries? Do it. Need a battery and an extra drive? Do it. Need two drives and you're on wall current? Do it. Today? Can't do it, use another dongle. Out of ports? There's a dongle for that so you can add more dongles (a USB hub).

I can't help but feel Apple has decided the core audience of their Unix-based powerhouse OS is the latte-sipping children in campus coffee shops and anything at all about their systems that appeals to anyone else is just something to be removed in the path to a sheet of paper with nothing but content. Frankly, it's that total disconnect between what computer users want and what mobile users want that has me worried about the Mac. The source of my fear — after much contemplation — is that the same people that design the Mac are designing the iOS devices, and that's a horrible situation for both platforms.


On Gaming Input Devices

Wanting to upgrade my input devices at home, I started looking around for a good keyboard and mouse combo. While the business-oriented lines were nice in their own ways, they lacked a certain flair and were woefully short of buttons and standard layouts. (What's with everyone screwing with the standard keyboard layout? Stop it. I like my buttons.)

As a result, I started to look at the gaming series of devices. I'm not sure how I wound up looking at them, honestly, but once I started to look at the options it was clear to me that all the attention on making input devices better at a hardware level was going into that market instead: the keyboards were mostly mechanical, the mice were high-DPI and loaded with buttons, and the quality was far and away higher — as were the prices, of course.

After some period of research I picked up the Corsair K70 keyboard and Corsair M65 mouse. Neither is too gratuitous with the lights off, and both are quite helpful if you set up the lights accordingly. By which I mean: think back to the 80s and keyboard overlays for Lotus. Like that, but with colors. So when switching to a game you can have the lights come up for the keys you normally use and then color-code them into groups (movement, actions, macros, etc.).

When used for daily stuff in Windows, and for some games, they proved quite the nice combo. The mouse has buttons to raise and lower the DPI and a quick-change button at the thumb for ultra-precise movement (think snipers in an FPS game, or clicking on a link on an overly-designed web page where the fonts are 8pt ultra-lights — yes, I really used it for that once).

However, I quickly found the Corsairs had a very large weakness: they literally only worked in Windows. I don't mean the customization and macros, I mean the devices themselves did not show up as USB HID devices in Linux or the Mac. They failed to be a keyboard and mouse on every other computer I have. Normally, I would blame this on my KVM's USB emulation layer, but I connected them directly and nothing changed. That is, until I read the manual and discovered the K70 had a switch in the back to toggle "BIOS mode". Now it worked, but I lost some customization and the scroll lock flashed constantly to tell me I wasn't getting my 1ms response time anymore (no, can't turn that off — flashes forever).

To add to the fun, the keyboard has a forked cable. One USB plug is for power and one is for data. If you connect the data cable to a USB 3 port on the computer itself then it can get the power it needs and you don't need the other. If you use a KVM or USB 2 hub then you're using both.

Overall, the frustrations outweighed the utility and I returned them both. I did some more research and found that, of all companies, Logitech fully supported their gaming devices on both Mac and Windows, and that their devices started in USB HID mode and only gained the fancy features when the software was installed on the host machine.

Taking that into consideration I went ahead and picked up the G810 Orion Spectrum keyboard and the G502 Proteus Spectrum mouse.

To summarize the differences: the Corsairs are effectively Windows-only out of the box. You can flip the K70's "BIOS mode" switch to get a plain keyboard everywhere else, but you lose customization and gain a permanently flashing scroll lock, and the keyboard still wants two USB plugs on anything short of a USB 3 port. The Logitechs show up as plain USB HID devices on everything from the start and only add the fancy features when their software, available for both Mac and Windows, is installed.

I'm especially happy that both keyboards have a dedicated button for turning the lights off when I just want a good mechanical keyboard, and back on when I'm doing something the lighting actually adds value to. That's a nice selling point for both, really.

At any rate, if you have a Mac, it appears only Logitech still cares about you. That's perfectly fine with me as they make some good stuff, overall.

WQHD, DVI, HDMI, Oh My

Even though I knew that video modes were a nightmare mess made barely tolerable by standards, I had no idea what hell awaited once one passed the 1200p edge.

A short history of video modes before we begin (this helps ease the pain later). Video is a three-dimensional concept that must yield to the laws of computer science and be flattened into a bitstream of two-dimensional arrays in order to go down the wire to the screen (and also in order to be stored in memory, but let's not complicate things more). You may be wondering if I added an extra dimension in the previous sentence, but I did not -- that additional dimension is time.

In computer science a list of numbers is an array (stop it, Melvin, you know where I'm going with this). A two-dimensional array is a list of lists (think of an outline). A picture, or a frame of video, is a 2D array of pixels (and each pixel is an array of component values -- usually RGB). You can quickly see why pictures take up a lot of space:

(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)

That's just a 6x6 image filled with black. But it's formatted for humans. A computer would see that closer to:

000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000

That raises a very important issue: how big is that, and what's what? It's just a mess of zeroes without some kind of structure around it to describe what's what. That's metadata. Specifically, it's a glob of metadata that conforms to a pre-set standard and describes what kind of pixel description is being used, how long a row is, and how many rows make up a single image. Using this information, a computer can break up that bitstream into chunks at the row boundaries, decode the pixel colors, and then draw row after row on-screen and create an image from it.
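As a contrived little sketch of that idea: assume the black 6x6 image above is sitting in a hypothetical headerless file called raw.rgb, three bytes per pixel. Given just the width and pixel size as metadata, the shell can slice the stream back into rows:

WIDTH=6; BYTES_PER_PIXEL=3                   # the "metadata": row length and pixel size
xxd -p raw.rgb | tr -d '\n' |                # dump the raw bytes as one long hex string
  fold -w $((WIDTH * BYTES_PER_PIXEL * 2))   # 2 hex chars per byte, so this prints one image row per line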

As one might imagine, it's very important to get that description correct. If you cut at the wrong boundary or your pixels are a different format (or size!) then it all goes to pot fairly reliably. There are ways to detect when this goes haywire, but it's better if it doesn't in the first place.

That brings me to video. So take the above and then add in metadata about time. That is, how many frames will be delivered in one second, what a frame's data will start with (just in case a partial frame is delivered -- don't worry about it), and what the current stream time is for any given frame (that is, when to display it in relation to other frames should it need to hold a frame on-screen longer than another one). There's a lot of other stuff, but it starts to get rather complicated after this.

To simplify all of this, there are video modes. These are pre-agreed upon configurations of all of the above that two devices can use to simplify this whole process. You've probably heard them by the name "resolutions" but it's the same thing (Melvin, hush).

The standard-bearer resolution at one sad point in time was 640x480 for the Video Graphics Array, or VGA. Realizing this sucked rather massively, the industry moved to 800x600 and slapped "SUPER!" in front of it and we got SVGA. Both of these supported the mind-boggling limit of 16 colors due to an infinitesimal pixel storage size. Because the set of computer scientists doing video did not intersect the set of artists in the world, black and white are considered colors for the sake of this discussion. You got 14 colors, and you liked it.

Realizing this wasn't really so "super" the 1024x768 resolution was put forth and slapped with the fancy XGA name. Among the improvements was a widening to 8 and 16 bpp which allowed for 256 and 64K colors in those modes. Things got crazy after that and Wikipedia can tell you all about it should you decide to inflict that madness upon yourself.
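If you want to check the color math yourself, it's just 2 raised to the bits per pixel (VGA's 16 colors imply a 4-bit pixel); a quick shell sketch:

for bits in 4 8 16 24; do echo "$bits bpp = $((1 << bits)) colors"; done   # 16, 256, 65536, 16777216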

The Point, Please Get to It.

As the frames got larger and the pixels wider and the refresh rates faster, there was an increasing demand put on the connection from the computer to the display. With the old VGA connector type it was not-trivial-but-not-impossible to keep adding these new modes to it because it was essentially broadcasting the video signal over the wire as a waveform. Enhance the transmitter and receiver and voila! -- more pixels.

Digital was quite different. After VGA (the connector) came DVI -- Digital Visual Interface. It's a mess of a standard because it was bridging the analog world into a brave new digital era, but it did impressively well for what it was. However, a single digital link was limited by available bandwidth to 1920x1200 (if you've been keeping up with the names, prepare for a winner: WUXGA -- Widescreen Ultra Extended Graphics Array; when in doubt, add a word). That means that if you want to power a higher-resolution screen then you'll need two of those links to get the bandwidth needed to break past that limit. More on that later.

When DVI was made it came in several flavors, but the most common were DVI-A (analog), DVI-D (digital), and DVI-I. The I in DVI-I stands for Integrated, which is the nice way of saying "we considered all you laggards and included some analog pins in here so we can deprecate that god-awful VGA adapter and you can just hook up an adapter to extract the analog signal and power your radiation box that way". It was a really efficient naming scheme, I think.

While they were busy shoving pins into this connector, someone had the bright idea that displays might keep getting larger and more dense and it would be nice to consider the future just a little. So a collection of pins in the middle was designated a second digital "link" and could be used to shove more bits down the pipe. Using both links, you could drive about four million pixels sixty times a second (2560x1600@60Hz). Thus we have Single-Link DVI and Dual-Link DVI.
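Some back-of-the-envelope math shows why that second link matters. A single DVI link tops out around a 165 MHz pixel clock, and counting only the visible pixels (real modes add blanking overhead on top of this):

for mode in "1920 1200" "2560 1600"; do
  set -- $mode                                  # split "width height" into $1 and $2
  echo "${1}x${2} @ 60Hz needs roughly $(( $1 * $2 * 60 / 1000000 )) MHz of pixel clock for the visible pixels alone"
done

The first mode squeaks under the single-link ceiling once reduced blanking is used; the second blows past it before blanking is even counted, hence the second link.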

Still Not Seeing a Point

Because displays could live on the sustenance provided by SL-DVI for many years, lots of things just didn't support DL-DVI (or DVI-A/I for that matter). That means things like displays that didn't need it, video cards that couldn't push it, cables for those folks, and KVM switches for the same reasons (oh, and cost). If you were the lucky owner of a 1920x1200 display then you were perfectly fine living in the Single-Link world. That includes all common HDTV formats, by the way, so a very large number of people are in this bucket (HDMI is compatible with SL-DVI for the most part).

But displays did, in fact, grow larger. As they grew larger, the previous version of larger trickled down to a more common position of the distribution curve, and now we're seeing video modes grow in popularity that hit this pain point right in the Y-joint: Quad HD (QHD) and Widescreen Quad XGA (WQXGA). If you take 720p video and double the height and width, you'll get Quad HD (2560x1440). If you take that 16:9 signal and make it 16:10 you'll get WQXGA (2560x1600). Remember that resolution? Yeah, that's the limit of DL-DVI. We found the edge of the world (again)!
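The "Quad" is meant literally, by the way: doubling both dimensions quadruples the pixel count.

echo "2560x1440 is $(( 2560 * 1440 / (1280 * 720) ))x the pixels of 1280x720"   # prints 4, hence "Quad" HD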

(By the way, you may have also heard resolutions described by trendier "K" names. The K is a rough count of horizontal pixels: 1080p is about 2K, these 2560-wide modes land around 2.5K, and true 4K is 3840 or 4096 pixels across.)

The Point (Really!)

I bought one of these bastards. Namely, I bought a 32" QHD display to replace my 27" Apple Thunderbolt Display because I wanted to use it with more than just my Mac (seriously, Apple?). With a DisplayPort to Mini-DisplayPort cable the Mac runs it just fine. With a DL-DVI, DisplayPort, or HDMI (1.3+) cable the other computers can use it just fine. With an SL-DVI cable -- or a DL-DVI cable running through an SL-DVI KVM -- I get 1920x1200.

It turns out that this really sucks. It sucks more that the DL-DVI KVMs for four computers (really) run about $400+, the DisplayPort KVMs appear to be limited to exactly two hosts (WTFSRSLY?), and HDMI KVMs only advertise Single-Link resolutions.

However, if you read the descriptions of the HDMI KVMs you'll notice something funny: they advertise being HDMI 1.3 switches. So, in spite of only listing Single-Link-class resolutions, they pass the full HDMI 1.3 signal, and that has more than enough bandwidth to drive your lovely WQHD (or near-4K) display. Also, since HDMI includes full 7.1 audio, you can switch that as well.

So, very long story short: if you need to switch your almost-4K display for a price well under $4K, use HDMI to connect it instead of DisplayPort or DL-DVI. In fact, if you handle the KM part yourself, you can get a 4K HDMI switch for about $30, and it'll do PIP and have a remote as well.

In the end, it's all about the bandwidth and the associated video mode standards. If the pipe is big enough, the pixels will flow.

Auto Unlock Requirements

After installing the GMs of Sierra and friends I was eager to try out the Auto Unlock feature with the Watch, as passwords generally suck and re-entering them time and again sucks more.

With all my devices on the same Apple ID and updated, I went to the Security prefs on the Mac and lo … no option for it. After reading around I learned all the devices must be marked as Trusted in iCloud, which means you need Two-Factor Authentication (not Two-Step). I set this up and re-added each device to iCloud until they were marked appropriately. Still no option on the Mac.

Then I read around a little and discovered the requirements for this feature, which I couldn't find on Apple's site anywhere. While early release notes for the betas said any 2013 Mac would work, it turns out that only mid-2013 and later models will work — specifically those with an 802.11ac network card.
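If you're not sure which camp your Mac falls into, the Wi-Fi hardware will tell you; a quick check on the Mac itself (look for "ac" in the supported PHY modes):

system_profiler SPAirPortDataType | grep -i "phy mode"   # e.g. "Supported PHY Modes: 802.11 a/b/g/n/ac"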

I ain't upgrading for this. Luckily, other folks have some solutions for this that work with less-stringent requirements on how recently your Apple tithe was paid, such as Knock and MacID.

Is It Illegal to Make Your Spouse Ride on the Roof of the Car?

Is It Illegal to Make Your Spouse Ride on the Roof of the Car? | Lowering the Bar

This question arises from the recent arrest of a Florida man (credit: The Smoking Gun) after he was stopped by a police officer who wished to inquire as to why there was a woman clinging to the roof of his car. The answer to that question, as you might expect, turned out to be complicated.

Every Developer's Nightmare

State: Answers were erased on 14,220 STAAR tests | www.mystatesman.com:

State officials are threatening to reconsider a $280 million contract with its testing vendor after answers to 14,220 state standardized tests were erased because of a computer glitch last week.

Programming Sucks

Also, the bridge was designed as a suspension bridge, but nobody actually knew how to build a suspension bridge, so they got halfway through it and then just added extra support columns to keep the thing standing, but they left the suspension cables because they're still sort of holding up parts of the bridge. Nobody knows which parts, but everybody's pretty sure they're important parts.

Programming Sucks

Every project I've ever worked on has this smell somewhere.

Not Paying Attention

People like to think Microsoft is the dean of proprietary software companies. Nonsense! Microsoft is making serious investments in open-source software. Apple, though, now there's a company that likes to lock down its code.



Apple's Swift Comes to Linux - ZDNet


Likes to lock down their code? WTH? Pay attention to that which you critique. The OS is open source, the compiler has always been open source, WebKit is open source, the core frameworks are open source, and they publish all changes to GPL code as they should. Seriously, what's your standard here, especially when comparing to MS?

I should expect this from ZD though. I think they're beating CNet to the bottom.

The Academy and Diversity

The issue is larger than the folks running the show can fix. Their members vote based on what they see as talent. Their membership is not at all diverse. Even this, though, isn't in their control. They mainly have A-level members with some scattering of Bs. Folks at that level trend towards the pale end of the spectrum as a product of the viewership's perceived preferences ("Ain't no white family going to see a movie with a black lead!" uhh, Lethal Weapon? The Matrix? A hundred others?)

So if we want a villain – and who doesn't? – it's the casting and hiring directors who are using outmoded descriptions of the viewership (from their producers and other management) from god-knows-when that limit the influx of diverse talent into the industry to begin with.

Fix that and then the cast and crews become diverse (as the people on-screen in the theaters become more diverse). Then they get their membership and vote. Then the nominees reflect this.

It's a huge problem, and not at all fair for anyone on any side to blame them for what their members chose. It's like blaming the county clerk for who won a political race. The voters chose. Fix the voters.