Given the range of talent Apple has brought in to work on its iWatch project over the past year, the iWatch is quite possibly why Jeff Williams, Apple’s COO, and four colleagues met with the FDA, the US healthcare regulator, in December, ahead of an expected launch later this year.
But it’s not the only possible reason. Nor, indeed, is it Apple’s first meeting with the FDA.
Good reporting from Bradshaw.
But, while recent FDA meetings might not be directly indicative of a new product, that doesn’t rule out a product which may require FDA approval. Just look at this hell of a scoop from Mark Gurman of 9to5Mac:
Apple plans for iOS 8 to include an application codenamed “Healthbook.” The software will be capable of monitoring and storing fitness statistics such as steps taken, calories burned, and miles walked. Furthermore, the app will have the ability to manage and track weight loss.
The problem isn’t that they botched it (although they did, in some ways). The problem is that Microsoft isn’t Apple, and Microsoft’s customers aren’t Apple’s customers. They tried selling a more Apple-like attitude to their customers, most of whom don’t want and won’t tolerate an Apple-like attitude. That’s why they’re not Apple customers.
Microsoft’s customers have always demanded, and will always get, exactly what they ask for. That’s the reality of serving the low- to midrange-PC business, and it’s sure as hell the reality of the enterprise business.
I disagree: Microsoft didn’t take an Apple-like approach with Windows 8. The Apple-like approach would have been to position and market the tile interface portion of the OS as an entirely different product for use exclusively with tablets. What they actually did was a much more Microsoftian approach: they crammed everything together into one big, bloated mess. It’s the “Windows everywhere” approach that has long been part of Microsoft’s culture.
Google will keep the Nest group intact inside the company. The new division will still work on hardware devices, but not necessarily thermostats or smoke detectors. In fact, Google would like [Tony] Fadell to work on gadgets that make more sense for the company. Will it be a phone or a tablet? It’s unclear for now.
While Nest first became popular with its thermostats, Google didn’t buy the company for these devices. First and foremost, the company wanted to snatch the great product team.
Completely unsurprising, but it makes sense: Nest simply has a better product design team than Motorola, and Google only needs one of them.
Microsoft is once again planning to alter the way its Start Screen works in Windows 8.1 Update 1. While the software giant originally released Windows 8.1 last year with an option to bypass the “Metro” interface at boot, sources familiar with Microsoft’s plans have revealed to The Verge that the upcoming update for Windows 8.1 will enable this by default. Like many other changes in Update 1, we’re told the reason for the reversal is to improve the OS for keyboard and mouse users.
For once, I disagree with the Sweet Setup. Duncan Davidson:
While Aperture and Lightroom are positioned as competitors, they have distinctly different strengths. Aperture’s strength is in its user interface and its organizational tools — it is the best tool for the long-term process of building out collections of finished photographs. However, when it comes to taking a pile of images, selecting the best of them, and then making those look awesome quickly, Lightroom is the better application.
I’ve used both extensively, and I found Lightroom to have a cumbersome interface and a strange workflow; its tools, while more robust than Aperture’s, don’t offset those issues. Couple that with Aperture’s library management and its super-easy “Vault” backup feature, and I find Aperture the better of the two.
But would I recommend it over Lightroom? Probably not. Apple tends to struggle with keeping their pro apps updated. In addition, Duncan Davidson is a photographer, while I am simply some dude on the internet who takes pictures. You should probably take his advice and use Lightroom, but I’ll probably stick with Aperture.
A touch input metaphor and a pixel input metaphor not only should be, but MUST be, wholly different and wholly incompatible with one another. It’s not just that they do not comfortably co-exist within one form factor; it’s also that they do not comfortably co-exist within our mind’s eye.
In plain words, it’s no accident that the operating systems for tablets and notebooks are distinctly different from one another. On the contrary, their differences — their incompatibilities — are the essence of what makes them what they are.
I’ve been arguing that for a while now. The idea of a single GUI that works for both touch- and keyboard-based input is mired in so many complications that it’s unlikely to succeed. However, I can imagine an operating system that changes its GUI to match the active input method.
Google announced that it was buying Motorola on August 15, 2011; the acquisition closed on May 22, 2012, for a final total of $12.5 billion.
Larry Page, today, 617 days later:
We’ve just signed an agreement to sell Motorola to Lenovo for $2.91 billion.
This acquisition cost Google $15.5 million for every day that they’ve owned Motorola. That doesn’t include the cost of rebranding, or the $500 million Google spent marketing the Moto X. (Mind you, it also doesn’t include Google’s sales of various other entities since they bought Motorola.)
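The per-day figure holds up. A quick sanity check, using only the dates and dollar amounts given above:

```python
# Sanity-check the per-day loss figure quoted above.
# All dates and amounts come from the post itself.
from datetime import date

purchase_price = 12.5e9   # deal closed May 22, 2012
sale_price = 2.91e9       # sale announced January 29, 2014

days_owned = (date(2014, 1, 29) - date(2012, 5, 22)).days
loss_per_day = (purchase_price - sale_price) / days_owned

print(days_owned)                     # 617
print(round(loss_per_day / 1e6, 1))   # 15.5 (million dollars per day)
```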
Why would Google lose such an enormous lump of money?
Google will retain the vast majority of Motorola’s patents, which we will continue to use to defend the entire Android ecosystem.
Josh Bryant (@jb nearly everywhere) was subject to a similar attack:
The scary thing was that I only thought of the true implications of this attack days later. As I was contemplating what had happened and how I could prevent it in the future, a very frightening thought occurred to me. This attacker started with Amazon because he knew that an e-commerce shopping site’s customer support would be relatively easy to convince and gain access. However, that same site offers cloud services that many startups (including mine) rely on to host their data. Droplr, the startup that I am a founder of, is completely based on Amazon’s stack, from using EC2 servers where we host all of our technology to S3 which we use for file storage. This attacker had access to all of it. I was extremely lucky that in his rush to gain access to @jb, he didn’t think to check if my account had anything under AWS.
I had a rare Twitter username, @N. Yep, just one letter. I’ve been offered as much as $50,000 for it. People have tried to steal it. Password reset instructions are a regular sight in my email inbox. As of today, I no longer control @N. I was extorted into giving it up.
It’s been a year and a half since Mat Honan was hacked, and it’s striking just how little has changed in that time.
Svbtle is an online publishing platform. Like WordPress.com and Blogger, and unlike Medium, it’s decentralized, giving writers the option to have their own domain and logo. And though style customizations are limited to colors and avatars, Curtis explains in today’s announcement that more options are coming soon.
There are more customization options available on Svbtle, but there is a distinct “Svbtle style” — there’s a Svbtle theme for WordPress, after all. Svbtle also runs a curated “Magazine” selection. I think it’s perfectly reasonable to compare the two, much in the same way as it’s reasonable to compare them both to starting a self-hosted WordPress blog: they’re all tools for putting your writing online.
What “The Typist” articulates well is what Medium does differently. In a tweet to me earlier today, he pointed out that:
Medium essentially outsources content in return [for] the exposure jackpot. But writing there isn’t any different than guest-blogging
I think that’s what makes Medium unique and, perhaps, sleazier. Maybe some of the $25 million they raised today will go towards paying writers.
For me, what Beats Music is doing with curated playlists and recommendations is immensely better than a top chart or an anonymous recommendation. Because there are people handling curation behind the scenes, you’re told why a playlist contains certain songs over others. I was watching the Grammys live the other night, and a few minutes after Macklemore won Best New Artist, Beats Music tweeted the “Macklemore vs. Kendrick Lamar” playlist during a commercial break.
Beats Music is hugely impressive. I feel that it’s the first real attempt at creating a streaming music service that moves beyond the established context of a massive library, or a recommendation-based radio station. It’s music done right, and I hope it succeeds.
Dustin Curtis has opened signups for his blogging platform:
Svbtle is designed to highlight the things that matter; it’s an extremely simple platform for collecting and developing ideas, sharing them with the world, and reading them. That’s it. We’ve focused all of our energy into designing the simplest interface possible for accomplishing these goals. Svbtle is blogging with everything else stripped away.
Svbtle and Medium are extraordinarily similar, and I’m not sure I understand either of them. Are they user-generated publications? Are they like WordPress.com (as opposed to the .org package) but with a cooler interface?
The second bit of strange news isn’t really that strange. At least, it shouldn’t be. MG Siegler:
You see, because of the downward trend of both the Mac and the iPod (in terms of revenue), the iPhone is becoming increasingly important to Apple’s bottom line. We’ve known for a while that Apple “lives or dies” each quarter based upon how many iPhones they sell, but it’s never been as reliant on that one device as it is right now.
iPhone sales now represent nearly 60% of Apple’s entire revenue stream. Meanwhile — and this is the part I find hard to comprehend — the iPod represents about 2%. From the company that was once “The iPod Company” to a company for which iPod sales now approach a rounding error.
Which brings me to Siegler’s other big point:
[I]t’s hard to envision another product line that can match the money-making ability of the iPhone. Remember that the iPhone is unique in that it’s a very expensive device that has been heavily subsidized in the U.S. by the mobile carriers. This is part of the reason why it struggles to sell as well in other countries — it’s simply much more expensive in other markets.
It’s also unique in that it’s a product which is inherently convergent, and therefore arguably more essential than any other product line in the technology market today. Tech companies are having a hard time coming up with new product lines because they’re trying to force it. Smart watches feel forced; internet-connected televisions feel forced; other wearable stuff does as well. The humanity of the smartphone is what made it so wildly successful as a category.
First bit of odd news out of today’s earnings report:
When asked why the iPhone 5c represented a smaller mix of total handset sales than Apple expected, Cook said he believes customers were simply drawn to the flagship iPhone 5s.
“I think the 5s, people are really intrigued with Touch ID,” Cook said. “It’s a major feature that has excited people. And I think that associated with the other things that are unique to the 5s, got the 5s to have a significant amount more attention and a higher mix of sales.”
I’d wager that the 5C sold a greater share this year than the 4S did last year, when it served a similar purpose in Apple’s lineup. The difference is that this year’s $99 model was positioned as a new model.
Anecdotally, I see a lot of 5Cs around Calgary — a lot.
I’m reminded of a recent Radiolab episode about Ötzi, the 5,000-year-old Iceman discovered in the Alps in 1991. For more than two decades, archaeologists have pored over the details of his clothing, his possessions, his tattoos, and the arrowhead lodged in his back, evidence he was murdered. From the contents of his stomach, they’ve even determined what he ate for his final meal. I wonder if there will someday be archaeologists who sift through our hard drives, tracing out the many digital trails we’ve left on the web, trying to determine not what we were eating, but what we were thinking. Will their findings be accurate?
President Obama announced new restrictions this month to better protect the privacy of ordinary Americans and foreigners from government surveillance, including limits on how the N.S.A. can view “metadata” of Americans’ phone calls — the routing information, time stamps and other data associated with calls. But he did not address the avalanche of information that the intelligence agencies get from leaky apps and other smartphone functions.
And while he expressed concern about advertising companies that collect information on people to send tailored ads to their mobile phones, he offered no hint that American spies routinely seize that data. Nothing in the secret reports indicates that the companies cooperate with the spy agencies to share the information; the topic is not addressed.
Yours truly, less than two months before the first Snowden documents were published (PDF link):
The key to understanding Snapchat’s appeal is seeing how radical and new the Friendster/Facebook/G+ version of identity is.
It got slipped into our lives as something normal and natural. But it is not.
This concept of privacy is a relatively recent construct, but is it wise to throw it away? Would we be happier, better, or more productive if we sacrificed much of our ability to dictate what information we release more-or-less publicly?
“We’re really bad at looking back in time,” [Keith] Hampton said, speaking of his fellow sociologists. “You overly idealize the past. It happens today when we talk about technology. We say: ‘Oh, technology, making us isolated. We’re disengaged.’ Compared to what? You know, this kind of idealized notion of what community and social interactions were like.” He crudely summarized his former M.I.T. colleague Sherry Turkle’s book “Alone Together.” “She said: ‘You know, today, people standing at a train station, they’re all talking on their cellphones. Public spaces aren’t communal anymore. No one interacts in public spaces.’ I’m like: ‘How do you know that? We don’t know that. Compared to what? Like, three years ago?’ ”
The Macintosh on the other hand was a revolution. Yes, it took the best bits of many people’s other ideas, just as the Model T Ford and Spitfire did, but it was the first home consumer or small office computer with a graphical user interface or GUI and that had to be the way forward, unless you were a cretin. Your screen was a white representation of a virtual desk, office icons and wastepaper basket (or trashcan if you prefer) included. There were folders, windows, pull down menus, all of which could be operated and manipulated, not by keyboard commands but by this mystical magical mouse, a rolling pointing clicking device that completely altered the way you related to everything you did on your computer.
In 2003, my twelve-year-old self sat with my dad as he was buying his first digital camera. I distinctly remember glancing over and seeing the then-new iMac G4 — the “sunflower” model, with its crazy suspended display — and being mesmerized. This, I thought, this would be my first Mac.
It was not my first Mac.
In 2005, I was at the same camera store shopping for my first digital camera, and I had the chance to play with a Power Mac G5 hooked up to a 30” Cinema Display. This, I thought, I must have one of these displays on my desk.
It also was not my first Mac.
In 2006, Microsoft unleashed Windows Vista on the world. Our family PC was old enough that it was incapable of running Vista; we’d have to buy a new computer if we wanted to upgrade. Since my parents would be buying a new computer anyway, they decided to look into switching to a Mac. They reasoned that the frequent blue screens we were experiencing, combined with the need for antivirus software and the annual defragment-and-reinstall of Windows, was more maintenance than they should be putting into a computer. They ended up with a late 2006 17” iMac, and were extremely happy.
But while I used it, it was not my first Mac.
While I had a job and it was “my” money, my parents taught me about budgeting and expenditures: any purchase of mine had to leave at least an equal amount remaining in my account. Therefore, I had been madly saving up at least $4,000 to justify the cost of the $2,000 MacBook Pro that Apple was certainly going to update at WWDC. They did, and I bought one: a 15” mid-2007 model with a 2.2 GHz Core 2 Duo processor, a 120 GB (!) hard drive, and 2 GB of RAM.
I remember it being delivered, and opening the giant box to see what, then, was a svelte slice of aluminum sitting in its protective styrofoam. But my clearest memory was thinking of it as a tool, and as an investment in my future.
That was my first Macintosh.
It’s been upgraded since I bought it — the logic board was replaced under warranty, so it’s now a 2.4 GHz model, and I personally upgraded the hard drive to a 750 GB unit and upped the RAM to 4 GB. I used it every day until August 2012, and I continue to use it regularly as a server, and to power installations as necessary. It’s turning seven this year, but it’s part of a remarkable thirty-year lineage.