Month: April 2015

Dawn Chmielewski, Recode:

Teardown Shows Apple Watch Sport Costs Just $84 to Build

Adrian Kingsley-Hughes, ZD Net:

Apple Watch Costs Under $85 to Make

Shaun Nichols, the Register:

What Is Apple’s Idiot Tax on Watch These Days? ‘About $265 or 80%’

All of these reporters simply echoed IHS iSuppli’s suspect claims as fact, or nearly so. They really think that the profit margin on the Apple Watch Sport is 70-80% — never mind the margin on the higher-end models. Not one of them seriously questioned the estimate; the closest any got was acknowledging that R&D costs are not included in the estimate. Everyone knows these estimates are a crock of shit, but this takes the cake.

Tim Cook:

“I’ve never seen one that’s even close to accurate,” Cook said of the oft-quoted estimates. His relatively strong reaction was prompted by a question regarding perceived weakness in launch margins for the Apple Watch.

Cook, continued:

On why Q3 Apple Watch margins may be smaller than people outside Apple expected

In the first quarter of any kind of product you would always have learning and these sorts of things. We’ve had this with every product we’ve ever done. And so, again, we’re not guiding to what it will be over time. We’re talking about what it is now. I would keep in mind that the functionality of the product that we’re making is absolutely incredible, the power of it. I haven’t even seen those, but generally there are cost breakdowns that come out around our products that are much different than the reality.

You think you’ve seen miniaturization? You’ve seen nothing yet. Keep in mind that the scale below the chip package is in centimetres; it’s less than one inch across.

Ben Dreyfuss responds to the utterly ridiculous Emoji Horse Violence Scandal of 2015 for Mother Jones:

Is merely mentioning the reality that horses are shot when they are lame outrageous? If you are outraged by the fact that horses are shot when they are lame, be outraged about the fact that horses are shot when they are lame, not someone remarking on the fact that horses are shot when they are lame.

I am as ardent a defender of equality and as passionate an opponent of prejudice and discrimination as you will ever meet, but even I am increasingly finding the internet uptight and humourless. Jokes don’t need to be based around a degrading or deliberately offensive attitude, but every tweet is now picked over to find something outrageous. There are lots of reasons for you to be justifiably concerned; don’t conflate those with off-the-cuff jokes. They’re not worthy of your constructed outrage.

Here are the first six paragraphs of Mark Gomez and Patrick May’s local interest story for the San Jose Mercury News. See if you can spot the error:

An iPad “test model” was one of the items taken during a robbery and kidnapping at a Cupertino house earlier this month, according to the Santa Clara County Sheriff’s Office.

The sheriff’s office, which would not disclose more details about the stolen device, but said it has not been recovered.

It’s unclear whether the Apple item was related to an upcoming product release or was an outdated model or test device.

“We are still investigating everything about this case,” sheriff’s spokesman Sgt. James Jensen said.

The robbers took the device, along with electronics, prescription drugs and cash valued at $7,500, from a Cupertino home during an incident in which a 20-year-old man was kidnapped and robbed after answering a woman’s online advertisement.

Authorities on Tuesday said the victim told detectives that “a test model iPad from Apple” was taken along with the other items.

In the first paragraph, the kidnapping is mentioned, but it takes until the fifth paragraph before the fact that a human being was kidnapped is acknowledged. And then Gomez and May jump right back into the crazy stolen iPad model in the sixth paragraph.

Here’s how I imagine the conversation between police and Apple legal went:

Apple: Hello, this is Denise in legal speaking.

Police: Hi, yeah, this is Sgt. Jensen with the Santa Clara County Sheriff’s Office. I just wanted to let you know that we noticed one of your prototypes was stolen from the home of one of your employees after he was kidnapped, and we haven’t been able to recover it. We know you’ll want it back, but we’ll need it for evid—

Apple: Wait, back up. One of our employees was kidnapped? Is he okay? How’s he doing?

Police: Good, yeah, fine. Hey, listen: I’d love to know what new features are on it. You know, so we know what to look for. Can I run two apps at once? Is it really light and thi—

Apple: Are you crazy?

Or something like that, if the police have similar priorities to the Mercury and the Register and PC Magazine and CNBC and Le Soir and the rest of the news outlets, apparently.

Mat Honan, Buzzfeed:

We did once. We believed it about IBM and we believed it about Microsoft and Google and Apple and Amazon. But what do we believe it about today? Is it Snapchat? Is it Uber? Is it Facebook? (It is probably Facebook.) Is a chat app really worth billions of dollars? Really? Well, sure. It is because people say it is.

So what happens when they say it isn’t?

Anyone who pretends to know the future is fooling you. Or themself. We are barely muddling our way through the present.

Proved today, with the “sunsetting” of Secret:

Alexis Ohanian, a founder of Reddit who invested in Secret, said at the time that it showed signs of being a contender for the future of social networking beyond Facebook. “Apps like Secret become an outlet for people to speak honestly about things that would otherwise result in career damage,” Mr. Ohanian said. He did not respond to a request for comment on Wednesday.

DisplayMate has published their preliminary analysis of the Apple Watch’s display, and they’ve discovered some new things about it which Apple chose not to publicize. Most significant, I think, is the internal calibration and accuracy of colour. I was worried that the OLED display would create significant oversaturation, but it looks like the colour profile is tuned to be similar to that of the very accurate iPhone 6.

But the biggest question is how the display of the Watch Sport compares to the Watch, the latter being both laminated and covered in sapphire. The title of DisplayMate’s analysis — “Apple Watch Display Technology Shoot-Out” — implies that they got their hands on at least one of each model and have subjected them to a comparative battery of tests. But that’s not the case: DisplayMate picked up only an Apple Watch, and they’re using the iPhone 6 for reflectivity measurements for the Sport. While the Sport does use the same kind of glass as the iPhone 6, the latter’s display is laminated, while the former’s is not.

At best, then, this isn’t a “shoot-out” between different Watch displays; it’s an initial analysis of one of the models. But that didn’t stop Buster Hein of Cult of Mac from proclaiming that the “Watch Sport has [a] better display than pricier models”. Not only is that not what DisplayMate is claiming; the statement is based solely on one assessed factor. While it’s true that this factor affects other parts of perceived display quality — more reflected light reduces display contrast — the sum total of all factors might still put the Watch’s display ahead of the Sport’s. We can’t know for sure until the Sport is put to the test. And even if there is a measurable difference, the perceived difference might be very little or none at all.

Serenity Caldwell bringing the mad transcription game for iMore. Tim Cook on Apple’s environmental efforts:

Last quarter, we also announced a major economic investment in Europe, where we will spend two billion dollars to build data centers in Ireland and Denmark. These will be our largest data centers in the world. […]

The two data centers we’re building will run on 100 percent renewable energy from day one. This is just part of the work we’re doing to protect the environment and leave the world better than we found it. Today, 100 percent of Apple’s U.S. operations and 87 percent of global operations are powered by renewable energy.

This really isn’t about the bloody ROI. This is going to cost Apple a shit-tonne of money, but it’s a good move for lots of reasons. And, in their case, $2 billion isn’t a lot, especially when both data centres will be paid for by money that is owned by Apple Distribution International, not Apple Inc.

On lacklustre iPad sales numbers:

So my belief is, as the inventory plays out, as we make some continued investments in our product pipeline — which we’re doing, that we already had planned and have had planned for a long time — between that, the inventory playing out, the enterprise starting to take over, I believe the iPad is an extremely good business over the long term. When precisely it starts to grow again I wouldn’t want to predict, but I strongly believe that it will.

My hunch is that the iPads coming up this year are going to be something really special. Radically different, conceptually, especially in software.

On Apple Pay:

Best Buy, which has been a longtime strong partner of ours, has just announced that it’s now offering Apple Pay in-app, and later this year will offer Apple Pay in all of their U.S. stores.

Remember the halcyon days of Best Buy’s exclusive loyalty to CurrentC, all of six months ago?

At least Samsung improved the horrifically ugly front camera and sensor arrangement of previous Galaxy models. It’s really hard to imagine any of Apple’s competitors developing their own typeface, then using it for, among other places, the serial number on a battery no customer will see, though. They — and pretty much all of their competitors — have got a long way to go before they can even think of mimicking the “Designed by Apple in Cupertino” signature.

At Macworld 2002, Steve Jobs unveiled iPhoto as an integral part of Apple’s now-legendary “digital hub” strategy. It was billed as “iTunes for photos”, and it was one of the reasons so many people I know bought a Mac. During that keynote, Jobs noted that one of the reasons Apple was building a photo cataloguing and editing app was that six million digital cameras were sold in the United States in 2001. In the first quarter of 2015, Apple sold 74.5 million iPhones worldwide, which means that they alone sold as many digital cameras in every week of their first quarter as the entire US purchased in 2001.
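The week-by-week comparison is a quick back-of-envelope calculation, and it holds up; a sketch, assuming a thirteen-week fiscal quarter and the figures quoted above:

```python
# Rough arithmetic behind the comparison above, assuming a
# thirteen-week fiscal quarter and the figures as quoted.
iphones_per_quarter = 74_500_000  # iPhones sold worldwide, Q1 2015
us_cameras_2001 = 6_000_000       # digital cameras sold in the US, 2001
weeks_per_quarter = 13

iphones_per_week = iphones_per_quarter / weeks_per_quarter
print(f"{iphones_per_week:,.0f} iPhones per week")
```

That works out to roughly 5.7 million iPhones a week, against six million cameras for all of 2001 in the US — close enough for the comparison to stand.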

It’s no surprise, then, that the organizational and editing model set by iPhoto is no longer as effective as it once was. You take your camera and a substantial editing suite everywhere with you, and it’s always connected to the internet, so your photos are always somewhere on a hard drive in the sky. They’re automatically geotagged and timestamped, and your favourites will probably end up on Instagram in a 640 × 640-pixel square box. In short: the way we shoot, edit, store, catalogue, and share our photos has completely changed. The software we use to edit them when we get back to our computer also needs to change.

iCloud Photo Library

Remember the days when you had to physically attach your camera to your computer using such ancient technology as a cable? Remember how you had to go through the arduous process of making sure your photos ended up in the right album while importing them, and manually geotagging them while shovelling coal into your computer to make sure it didn’t die in the middle of this process? Or, at least, that’s what it feels like now.

For a company so at the forefront of the “digital hub”, Apple was very much a laggard in advancing that model, especially in cloud services. Every year brought new printed product designs — which, admittedly, were gorgeous — and new editing tools, but Apple appeared content to lag behind its competitors in syncing, storing, and sharing in the cloud. These shortcomings were unfortunately showcased in Apple’s flagship product: the iPhone. Despite each generation of iPhone becoming a way, way better camera with loads of networking capabilities, the easiest way to get photos into iPhoto was to plug it in and hit the import button on your Mac.

Apple’s initial remedy for this was Photo Stream, which was introduced as part of iCloud in 2011. Photo Stream stored your last thousand photos from your iPhone or iPad for up to thirty days and synced those photos between your iOS devices and your Macs, all automatically. What made it extra sweet was that it occupied none of your iCloud storage quota.

But Photo Stream was a decidedly stopgap measure, and it felt especially half-assed on the Mac. In iPhoto and Aperture, Photo Stream appeared as an album, but it had very little actual album functionality. You couldn’t edit photos in Photo Stream, for example — you had to drag your favourites from Photo Stream to a local album to edit them. And you couldn’t manually place the photo back into Photo Stream when you were done, making this whole exercise a little silly. It clearly wasn’t designed to be a cloud photo storage library so much as a way to conveniently view your recent iPhone pictures on your iPad, Mac, or Apple TV. And it didn’t store video files.

What I’ve wanted for a long time is pretty simple: I’d like my library of photos to be stored in the cloud, and I’d like to edit my photos locally and have everything sync up at the end of the day. Why? Scott Forstall nailed it when introducing Time Machine at WWDC 2006:

When I look on my Mac, I find these pictures of my kids that, to me, are absolutely priceless. In fact, I have thousands of these photos. If I were to lose a single one of these photos, it would be awful. But if I were to lose all of these photos because my hard drive died, I’d be devastated. I never, ever want to lose these photos.

So what should I do? What does everyone tell you to do to make sure your photos are all secure, and you don’t lose them? “Back it up”. Right? So everyone says it, you’re all saying it, we all know we should back it up. And I know I should back up, uh, but I don’t.

Lack of children notwithstanding, my photo library is pretty precious to me, and probably to you too. Photos are heroin for our nostalgia receptors, if there are such things. They remind us of specific places and moments. They jog our memory for things we want to remember, and remind us of things we don’t even remember forgetting.

So I do what I’m supposed to do: I back up my photos, along with the rest of my files. I have what is probably a better backup regimen for my photos than most people: not only do I have my Aperture library on a drive that backs up with Time Machine, I also have a Vault set up that backs up to two separate drives. I’m in the minority — apparently, only 10% of users surveyed by Backblaze back up their files daily.

But all three of my backups are in my apartment; if I were serious about backing up, I should be using an offsite backup solution, like Backblaze or CrashPlan. Both of those companies manage their own data centres, and both have pretty great track records of doing so. So would you feel comfortable entrusting your precious memories to a company way bigger than Backblaze and CrashPlan combined? A company that has been in business for nearly forty years? A company that bragged about its media streaming prowess over ten years ago?

Yes, Apple should be the obvious choice for a company you can trust to keep safe your most precious memories. Yet, despite their apparent solidity, Apple has a spotty track record when it comes to cloud and web services. From incomplete and poor data in Maps to the iTunes errors many of us see daily, Apple’s record isn’t great. And now they’re asking us to entrust our photos to them. Gulp.

I wanted to turn iCloud Photo Library (hereafter: iCPL) on everywhere so I could get the best possible experience.1 Switching it on for my iPhone was easy: I already had the 20 GB iCloud storage upgrade, so the 3-4 GB of photos on my phone fit perfectly in that space, with room to spare. But I shoot RAW files with my Canon, and I have about 60 GB of those in Aperture. So my first order of business was to upgrade my iCloud subscription.

Luckily, Apple is no longer criminally insane, so they now charge reasonable prices for their subscriptions. A 200 GB plan for four dollars per month is a no-brainer.

When launching Photos for the first time, you will be prompted to import your existing photo library. The import process creates hard links to your old photo library and uploads everything to iCloud over HTTPS:

cloudd.5391                                                                                 154 KiB          12 MiB      0 B       0 B     562 KiB
tcp4<->                        en2     Established        3909 B           5227 B      0 B       0 B       0 B    77.97 ms   256 KiB    39 KiB        BE         -     cubic
tcp4<->                        en2     Established        3952 B           4723 B      0 B       0 B       0 B    71.97 ms   256 KiB    39 KiB        BK         -    ledbat
tcp4<->                        en2     Established        135 KiB          146 KiB     0 B       0 B    1398 B    68.78 ms   256 KiB    39 KiB        BK         -    ledbat
tcp4<->                        en2        TimeWait        6772 B            23 KiB     0 B       0 B       0 B    63.41 ms   256 KiB    39 KiB        BE         -     cubic
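The hard-link detail is worth dwelling on: because the new Photos library and your old iPhoto or Aperture library point at the same files on disk, the import doesn’t double your storage use. A minimal illustration of the mechanism — this mimics the behaviour described, it is not Photos’ actual import code, and the file names are invented:

```python
import os
import tempfile

# Two directory entries pointing at the same inode: the photo bytes
# exist on disk only once, no matter how many libraries "contain" it.
with tempfile.TemporaryDirectory() as d:
    original = os.path.join(d, "IMG_0001.CR2")
    with open(original, "wb") as f:
        f.write(b"\x00" * 1024)          # stand-in for RAW data

    imported = os.path.join(d, "Masters_IMG_0001.CR2")
    os.link(original, imported)          # hard link, not a copy

    a, b = os.stat(original), os.stat(imported)
    print(a.st_ino == b.st_ino)          # same inode
    print(a.st_nlink)                    # two names, one file
```

Deleting one library later doesn’t delete the photos; the data is only freed once every link to it is gone.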

Uploading all these photos on my home broadband connection took what I imagine is a long time, but I’m not certain exactly how long because it’s completely invisible to the user. It runs in the background on your Mac and on your iPhone, when you’re charging and connected to WiFi. I can’t find any setting that would allow you to do this over LTE, but I’m not sure why you’d want to — photos taken on my 5S are something like 2-3 MB apiece. (I’m aware that this paragraph is going to sound incredibly dated in a few years’ time, for lots of reasons.)

And this is primarily what sets iCPL apart from backup solutions like Backblaze, or other “automatic” photo uploaders like Google+ or Dropbox: it’s automatic and baked in at the system level. Dropbox can’t do that on iOS because third-party apps can’t stay running in the background indefinitely, or spawn daemons to do the same. On a Mac, it’s a similar story: because Power Nap doesn’t have a public API, competing services can only sync while the Mac is awake. iCPL, on the other hand, can take full advantage of being a first-party app with full private API access, so it continues to sync overnight. Nice, right?

As of writing this paragraph, I have 6,149 photos and 12 videos stored in iCloud. Most of these — about 4,000 photos and all videos — are from my iPhone. The rest are RAW files from my Canon XSI. Both work fine in iCPL; it accepts all the popular image file types, and MP4 video files.

While writing this, I realized that I had an archive of approximately 10,000 photos I had to remove from my phone over the past couple of years to free up space, so I’ve started importing those too. During the import of a second batch of photos, I mis-clicked the option to import duplicate photos. It turns out that Photos doesn’t have a post-import duplicate detection tool, which is baffling to me. In a choice between manually finding and removing about a thousand duplicate photos or just leaving them in the cloud, I’ve chosen the latter. I have plenty of storage, and it’s far less frustrating.

All this iCloud storage also means that you can free up some disk space. By default, your device will likely be set to download and keep original photos. If you’d prefer, though, you can choose to allow automatic disk space optimization. This will place all the original files in iCloud, and each device will download only what it needs, on demand.

Be warned, though: fetching photos from the cloud as needed, in combination with a cellular connection on your phone, can lead to some nasty surprises. I was curious as to whether Photos on iOS would download an original RAW file, or whether it would grab an optimized JPG version. Not only did it grab the RAW version of a 13.2 MB photo, it also downloaded what I can only assume are a couple of buffer files on either side of the selected photo, all of which happen to be RAW files in this case. Total bill for downloading one photo: slightly over 50 MB of my 1 GB monthly bandwidth allotment. Eek.
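The arithmetic on that surprise is worth spelling out. A sketch using the figures above — note that the neighbouring “buffer” downloads are my own observation, not anything Apple documents:

```python
# Cost of viewing a single RAW photo over cellular, per the figures above.
# The adjacent buffer-file downloads are observed behaviour, not documented.
raw_photo_mb = 13.2
observed_total_mb = 50.0        # selected photo plus adjacent RAW files
monthly_allotment_mb = 1024.0   # a 1 GB cellular data cap

share = observed_total_mb / monthly_allotment_mb
print(f"{share:.1%} of the monthly allotment")
```

Nearly five percent of a month’s data, gone on one tap. Twenty such taps and the cap is blown.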

Now that I’ve got all my photos on a hard drive in the sky, I should create something.


So you’ve spent a day out and about, shooting a bunch of photos on your digital SLR and your iPhone in a bunch of different locations. You get home and you want to import, sort, edit, and share these photos. Pretty standard workflow, right? Let’s get started.

For photos taken on the iPhone, the importing is obviously taken care of automatically via iCloud. Importing photos from your SLR is done the old-fashioned way, by either connecting your camera via USB, or by ejecting the SD card and plugging it into your Mac. When you do, a new tab — Import — will appear alongside the existing Photos, Shared, Albums, and Projects tabs.

Importing photos couldn’t be simpler. Across the top is a filmstrip of photos you already have in your library; below it are new photos. Select the ones you want, or just click “Import All New Photos”. I’ve found that importing is just as fast as it was in Aperture, if not faster. Thumbnails build quickly, and scrolling is far, far smoother and faster than in either of its predecessors.

The Album sorting paradigm still exists in Photos, but it’s decidedly subdued. Like on iOS, the default view separates photos automatically based on date, time, and location. Faces are available as a categorization method too, but they’re not as pronounced as they were in iPhoto.

For being a primary factor in the way photos are grouped, locations get the least amount of love in the app. For a photo to have a location assigned, it must have been taken on a camera that automatically adds location data; there is no global map view, and no way to manually assign a location to photos. However, if you do a mixed import of GPS-tagged iPhone photos and untagged SLR photos taken on the same day and in the same timeframe, Photos will assume the untagged photos were probably taken in the same place, and group them alongside each other in your collection.

Sorting through your imported photos to find your favourites is even simpler than it was in iPhoto and Aperture, and by “simpler”, I also mean “less capable”. You can tap the heart-shaped button to mark a photo as a favourite, and that’s it. There are no star-based ratings, nor is there a two-up view to compare between similar shots and pick your best.

As you dive deeper into Photos, you’ll notice a pattern beginning to emerge: it does basically the same stuff as iPhoto, but in a far nicer way, and is no match for Aperture. It’s also far more than simply a scaled-up version of its iOS namesake.

By default, the enabled palette of editing tools consists of the Colour, Light, and Black and White editors. Like their iOS counterparts, these three simple-looking sliders each combine multiple adjustments that are continuously assessed and modified on a per-image basis. Cranking up the Light adjustment on one image, for example, might significantly increase the exposure while reducing highlight brightness. Doing the same on a different image might instead reduce the exposure while cranking up the shadow brightness. Clicking the small disclosure arrow beside each of these adjustments reveals all of the subset tools, so you can further tweak each aspect.
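To make the idea of a compound slider concrete, here’s a toy sketch of how a single control could drive several sub-adjustments at once, weighted by the image’s own statistics. Apple’s actual heuristics aren’t public; every function, weight, and threshold here is invented purely for illustration:

```python
# Toy model of a compound "Light" slider: one input, several outputs,
# with the mix chosen per-image. All weights here are made up.
def light_adjustments(slider: float, mean_luma: float) -> dict:
    """slider in [-1, 1]; mean_luma in [0, 1], where 0 is a dark image."""
    if mean_luma < 0.5:
        # Dark image: lead with exposure, lift shadows, protect highlights.
        return {
            "exposure": slider * 1.0,
            "shadows": max(slider, 0.0) * 0.8,
            "highlights": -abs(slider) * 0.3,
        }
    # Bright image: move exposure less, recover highlights instead.
    return {
        "exposure": slider * 0.4,
        "shadows": slider * 0.2,
        "highlights": -max(slider, 0.0) * 0.8,
    }

print(light_adjustments(0.5, 0.2))  # dark frame: exposure-led mix
print(light_adjustments(0.5, 0.8))  # bright frame: highlight-led mix
```

The same slider position produces a different blend of adjustments for a dark frame than for a bright one, which matches the behaviour described above.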

But there are far more tools bundled into the Mac version of Photos than in its iOS counterpart. Clicking the “list” icon in the top-right of the adjustment palette reveals a vastly broader range of tools, from levels to white balance to sharpness. This is as much an ode to simplicity as it is to needless difficulty. While I understand hiding a fairly complex tool like levels from most end users, most people would probably be comfortable with white balance and sharpness. There are some notable omissions here, too: there is no curves tool, for example, nor adjustments for fine-tuning RAW files after import. Even some tools that are in the app, such as the magenta/green tint, are buried within other tools — in this case, the white balance tool. This depth means that an adjustment previously requiring one click now takes a couple more.

The tools that do exist are of a very high calibre. As I mentioned above, the three standard tools are complex and nuanced, adjusting multiple parameters constantly to create a great image. On a RAW file with good exposure, the black and white adjustment doesn’t leave a bunch of blocky noise everywhere.

Unfortunately, unlike in Aperture, these adjustments cannot be layered. You cannot, for example, have two instances of levels. You get one, and you’ll be happy to have it.

Most impressive of all is that these adjustments sync over the air to your iOS device in a non-destructive manner. You can tweak the colour on your Mac, then grab your iPhone and use the same tool. But this ability is limited to Colour, Light, and Black and White, filters, and cropping; additional adjustments are not editable on iOS, and any photo with adjustments beyond this set will appear as “flattened”. That is, if you apply a filter and you also adjust the white balance, you won’t even be able to change the filter.

I’ve found that syncing edits between devices is pretty instantaneous. Usually, by the time I unlock my phone and launch Photos, the edits have synced. Occasionally I’ll run into an issue where I’ll make an edit on my Mac, then view it on my iPhone, then make another edit on my Mac, and the phone copy won’t update. The thumbnail usually will, but the full-size image will be cached. I’ve found that I can usually work around this by force-quitting Photos on my phone, making a small tweak on my Mac, and relaunching Photos after a minute or so. The small tweak will help trigger a re-sync, and the minute will give everything plenty of time to catch up.

There’s one more tool available on the Mac that isn’t to be found in the iOS versions of Photos: a retouch tool. I’ve used a lot of different photo retouching tools, including those from iPhoto, Aperture, various versions of Photoshop, and various iOS apps. I must say that the one in Photos is easily one of the best I’ve ever used. Even in automatic mode, where it guesses the best source area, I’ve found it to be consistently great at matching tones, colours, and textures. This is a hard feature to get right; even the one in Photoshop is often flummoxed. But the one in Photos seems to get it right more often than not. I’m smitten.

There are also the typical tools you’d expect. There’s auto-enhance, which I never, ever use, but I tested it for the purposes of this review and I found that it does, indeed, work. There’s also a cropping tool, which annoys me because it resets the aspect ratio every time you select a photo. So if you want to crop a series of photos to 3:2, you have to manually select it each time.

So, you’ve picked your favourite photos from those you imported earlier, and you’ve edited and cropped them. Now it’s time to share them. Syncing them to your iOS devices is a piece of cake — they’re already there.2 Sharing to your social networks is also easy: it uses OS X’s share sheets, naturally. Sharing to disk, however, is a little bit hidden. It doesn’t appear in the contextual menu, nor anywhere in the apparent UI, but it’s there, under File → Export. You can also create the usual plethora of calendars, greeting cards, and books.3 I didn’t test this beyond creating one, but it’s pretty much what you’d expect if you’ve ever created a printed photo product with Apple before.

Photos gets most of the basics right, and I do like what it does. Yet I can’t shake the pervasive feeling that this is no Aperture replacement. It’s clearly designed for the way in which most people take most of their photos these days, and that’s fine. If this were solely a replacement for iPhoto, it would be spectacular. But as Aperture was discontinued at the same time, this feels like a product that must fill the shoes of both of its predecessors. I have no doubt that, over time, myriad plugins and extensions will be created for it that bring it far closer to Aperture, should you so choose. Perhaps Apple will also allow multiple instances of the same adjustment tool to be used at once, and add features like two-up viewing and duplicate detection, both of which are pretty much essential.

For me, there’s no shaking the fact that this doesn’t feel like Aperture. There was something about editing a photo in that environment that felt like you were creating something really special. It was the kind of application you could get lost in. Photos doesn’t feel like that. It’s not the all-Yosemite UI, I don’t think, nor is it any particular addition or removal of features. It’s just, somehow, not quite as engaging, immersive, or just plain fun.

I like Photos, but I don’t love it. Yet.

  1. True to form, my iCloud Photo Library experience got off to a rocky start. Photos simply wouldn’t sync, so I had 1,600 memories stuck in the void. Furthermore, the Photos “app” on the iCloud web service never appeared for me, even after iOS 8.1 was released. I tried all manner of toggling, restarting, and resyncing. By some fluke, I managed to make the web app appear, but it was stuck in a preparation mode. Manually triggering a sync would throw a “sync will resume when this iPhone has restored from iCloud backup” message, despite never having used iCloud for my phone backup purposes.

    I filed a radar on this, and mentioned its number to anyone who would listen. And it worked. I got a call from a nice person on the iCloud team, who reset some caches on their end and did some other wizardry. This worked on my Mac and iOS devices, but not on the web. It wasn’t until a few weeks later that the latter was debugged.

    This is normally the sort of thing I could debug on my end, but as iCloud has been designed as a black box of magic, I couldn’t. When it works, it is magical, but the lack of transparency makes debugging virtually impossible. I’m not sure what kind of verbosity developers see in the console when writing their own iCloud-enabled apps, but if it’s anything like what I’ve seen, I understand their frustration. ↥︎

  2. Some data from Photos, like Faces, isn’t very visible on iOS, but it does sync. You can reveal it by doing a search for someone whose face you’ve tagged. ↥︎

  3. Based on some early chatter I heard, I was under the impression that printed products were being dropped in Photos. I was wrong, and I regret the error, but I’m delighted that these products are still around. ↥︎

They definitely do what they’re supposed to do: make the Watch look effortless, fast, and of light cognitive weight. Also, the score sounds a little Trent Reznor-y, so that doesn’t hurt.

Some things I noticed while reading through this:

  1. Everything is really, really small. This should be obvious to anyone who’s tried a Watch or caught a glimpse of just how tiny they really are, but it’s worth saying again: the amount of miniaturization Apple has done here is staggering.

  2. Everything is typeset in San Francisco, from the instructional pamphlet, to the warning text on the battery, to the serial number on the back of the Taptic Engine.

  3. iFixit has generally not been kind to Apple in terms of their repairability scores, but they gave this a 5/10. I guess they’re letting a lot of stuff slide since this is a crazy-miniaturized wearable product.

  4. This is arguably one of the more interesting teardowns iFixit has done. Most smartphones are kinda similar inside; this is a radically different product. And, I must say again, it is tiny.

  5. The as-of-now nonfunctional oxygen sensor is intriguing.

That’s because it will look like a Moto 360. Alex Dobie, Android Central:

That’s right, it sure looks like the long-rumored circular Samsung Gear watch might finally come to fruition. The move to a rounded watch face would explain the need for Samsung to open up the SDK to developers ahead of launch day, as the move away from a rectangular face would be a significant change for Samsung’s seventh smartwatch.

The weather app they’ve got onscreen sure does look a lot like the weather app on the Apple Watch, though.

Setting aside petty stuff like that, it’s pretty clear that Abdel Ibrahim’s prediction is coming true. Even with its display off, the Apple Watch is unique amongst the current generation of smartwatches in its shape alone. Google’s also using circular screenshots in their Android Wear feature announcements, which continues to reveal just how bad a circular display is for actually, you know, displaying stuff.

Jony Ive, as quoted by Scarlett Kilcooley-O’Halloran, for Vogue:

What we’ve done fairly consistently is try to invest tremendous care in the development of our products. It’s not so much about things being touched personally – there are many ways to craft something. It’s easy to assume that just because you make something in small volumes, not using many tools, that there is integrity and care – that is a false assumption.

This is a fascinating concept: craftsmanship in a mass production environment, the scale of which is unprecedented. It’s one thing to precisely cut grooves in a tiny dial once, or make a single display that’s laminated to a piece of sapphire. It’s also one thing to make a million crappy plastic products. It’s a whole different story to try to execute goldsmith-like craftsmanship millions of times over. That’s something at which Apple is unparalleled.

Alex Sherman, Bloomberg:

This week, U.S. Federal Communications Commission staff joined lawyers at the Justice Department in opposing the planned transaction. FCC officials told the two biggest U.S. cable companies on Wednesday that they are leaning toward concluding the merger doesn’t help consumers, a person with knowledge of the matter said.

Bloomberg’s source says that the decision should be announced tomorrow, but this is Comcast, so it could be next Tuesday for all they care.

With a supply chain as enormous and lengthy as Apple’s, there’s only so much they can anticipate in advance, especially for a product with so many variations. They can research how many people will prefer the Milanese loop over the bracelet, but it’s nothing more than an educated guess. Hence, preorders: Apple’s chance to assess where the actual demand lies and adjust supply to match.

Bloomberg’s Gordy Megroz profiled Dave Asprey in advance of the launch of Asprey’s Bulletproof Café in Santa Monica in a report that’s absolutely appalling in its skepticism, or lack thereof. For the uninitiated:

[O]f all his out-there health claims, it’s the coffee he’s drinking—blended with butter made with milk from grass-fed cows and a medium-chain triglyceride (MCT) oil derived from coconut oil—that’s making Asprey most famous.

He calls the mixture Bulletproof coffee. Drink it, the name implies, and you’ll feel invincible. “Fats and caffeine help stimulate the brain,” Asprey says in his office, taking another sip. The coffee, along with the drug cocktail he’s just downed, which includes vitamins K and C as well as aniracetam, a pharmaceutical designed to improve brain function, is intended to provide hours of enlightenment. “There’s a sense of cognitive ease, where everything you want to say is at the tip of your tongue,” he says. “It’s like getting a new computer—you never want to go back to the old one.”

It sounds great. It sounds magical. It sounds citation-free. It smells a bit like bullshit:

As far as MCT oil improving brain function, that’s not a call that can be made yet (sorry Bulletproof). There was a study that used MCT oil to treat people with Type 1 Diabetes and another that used it for Alzheimer’s patients, and both studies found that MCT oil helped to repair some cognitive function. BUT (and it’s a big but), we cannot extrapolate the results from subjects with significant cognitive impairment and pretend to know the impact on subjects with normal cognitive function. It would be nice, but that’s just not how biology works.

Is it possible? Yes, it’s possible, but it’s far from proven. Indeed MCT oil is very controversial in the nutritional community.

Let’s keep going with the Bloomberg story:

A 12-ounce bag of Bulletproof coffee sells for $18.95, more than twice the price of a bag of Starbucks. A small cup will cost $4.25. “Our coffee goes through extensive lab testing to make sure it doesn’t contain toxins,” Asprey says. “You’re paying for quality—something that won’t make you feel bad.”

That’s bullshit, too. Pretty much all coffee is washed before roasting, so there are practically no mycotoxins left on the beans.

This article is about 2,400 words long, but just three paragraphs contain any response from health professionals. It’s mostly bunk, and Megroz bought right into it.

Fascinating new service from Google, as explained by VP Nick Fox:

We developed new technology that gives you better coverage by intelligently connecting you to the fastest available network at your location whether it’s Wi-Fi or one of our two partner LTE networks. As you go about your day, Project Fi automatically connects you to more than a million free, open Wi-Fi hotspots we’ve verified as fast and reliable. Once you’re connected, we help secure your data through encryption. When you’re not on Wi-Fi, we move you between whichever of our partner networks is delivering the fastest speed, so you get 4G LTE in more places.

This is really clever. If the pay-what-you-use pricing model puts the pressure on carriers the way Google Fiber did on ISPs, this should be good for everyone, not just Fi users. Pity we’ll likely never get this in Canada, though; we desperately need a shake-up.

Juli Clover, MacRumors:

Apple today began sending out emails to iOS developers, offering them a chance to purchase a 42mm Apple Watch Sport with a Blue Sport Band that has a guaranteed shipment date of April 28, 2015, in order to get them a device to begin testing apps on. Quantities of the watch are limited, and developers eligible to purchase a watch will be chosen by random selection.

You read that right: Apple has been so swamped with orders for the Watch that they are emailing random iOS developers offering them a chance to purchase a development unit that will ship only shortly after the first batch arrives on customers’ wrists. That’s two layers of lottery these developers must win in order to acquire one of these.