Monday, December 31, 2007

The Future of DVDs



There's an interesting article on Seeking Alpha today about Apple and the future of the DVD player. Unfortunately, it is based on some extremely poor assumptions; in particular, they wrongly assert that it's cheaper to distribute physical DVDs, or HD-DVDs, than bits.

Um... no. Not even close.

It may be slower (downloading a 5GB DVD will take you a little over five and a half hours at 2Mbps) but that's not comparing apples to apples. Apple is selling near-DVD quality video that takes about 20% of the space of a DVD. They could probably sell you DVD-or-better quality video in about 30% of the space.
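If you want to check that arithmetic, here's a quick sketch (assuming 1GB = 1024MB and ignoring protocol overhead):

```python
# Back-of-the-envelope download time: file size in gigabytes,
# link speed in megabits per second. Note the bytes-to-bits conversion.
def download_hours(size_gb: float, mbps: float) -> float:
    size_megabits = size_gb * 1024 * 8  # GB -> MB -> megabits
    return size_megabits / mbps / 3600  # seconds -> hours

print(round(download_hours(5, 2), 1))    # full 5GB DVD at 2Mbps: ~5.7 hours
print(round(download_hours(1.5, 2), 1))  # ~1.5GB iTunes-sized movie: ~1.7 hours
```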

Yes, an iTunes movie doesn't include the director's commentary and other crap*, but it doesn't force you to watch ads, redundant anti-piracy messages, and you can actually jump to any spot in a movie effortlessly. And a director's commentary is just an audio track anyway.

How much does 1.5GB cost Apple? A darn sight less than the cost of a physical DVD, packaging, inventory, distribution, anti-theft devices, etc. etc. etc. Heck it costs the consumer almost nothing (one 480th of the month's broadband bill) and that's retail. If Apple is paying $0.10 to serve 1.5GB I'd be shocked, and that's way cheaper than the wholesale cost of DVD replication. And the marginal cost (if you already have broadband) is zero. To put it another way, that's a lot less than you're paying for shipping (even if it's "free"). And Apple doesn't end up with random quantities of unsold inventory that need to be discounted or turned into landfill.

I think the writer is confusing cost with bandwidth (a semi-trailer load of DVDs represents a ridiculous amount of bandwidth).

Hulu



NBC and a number of partners are giving their content away free online. (More evidence that the idea that physical distribution is cheaper than network is ridiculous.) You can watch very high quality video of new TV shows with fewer ads on hulu.com than on NBC itself. Admittedly, this is part of NBC's bizarre** "anything but Apple" distribution strategy... It's simply implausible that NBC will make more money off a TV show by giving it away free with 2.5 minutes of unoptimized ads than by taking a cut of the $2 Apple would sell it for, but it must be a cheap enough strategy that NBC isn't bleeding money out its eyes testing it.

DVD Player Sales Down. Widescreen TV Sales Up.



Also today I read an article about the impending demise of Circuit City. The article discussed consumer electronics sales trends, and the most interesting one to me was that TV sales overall are down (flat panels up, projection and CRTs down) as are sales of DVD players. Computer sales (in dollars) are going down with prices (not volume).

Make no mistake, the US TV market is supersaturated. Most homes have more TVs than they know what to do with, so most TV sales are pure replacement/upgrades. It seems to me that people want bigger, better displays, but they aren't buying devices to push content through them (or perhaps they are -- in the shape of XBox 360s and PCs).

* I used to love directors' commentaries, but they seem to have stopped being about "how we ended up telling the story this way" and instead become a bunch of pointless trivia about who did the work behind this obscure aspect of the scene, padded out with the usual attaboy backslapping, everyone-in-Hollywood-is-a-genius marketing BS. In essence, I think that because movie-making has become so demystified (in large part because of directors' commentaries) the proportion of interesting information in directors' commentaries has started to approach zero. You still need to listen to Big Trouble In Little China's commentary.

** OK it's not that bizarre. NBC has been in Microsoft's pocket for a long time now, and the original venture with the iTunes Music Store was probably an aberration driven by fear of being left out. Since iTMS isn't making bajillions of dollars, NBC can now safely spurn it... Who knows, maybe giving away content paid for by untargeted and repetitive ads really will pay dividends. I know I'm buying a ton of Cisco routers having seen the same damn Cisco ad 50x on Hulu.

Thursday, December 27, 2007

It's the end of the (TV) world as we know it, and I feel fine



For ten years, I've been saying to friends and anyone else who is sufficiently immune to boredom that network TV was headed the way of commercial radio, and predicting it would be all over in five years. I was obviously at least five years ahead of myself. The proverbial fat lady hasn't sung, but I believe that the television writers' strike will be looked back on as marking the beginning of the end, a watershed moment.

Network (and indeed "standard cable") television is dead to me, and I suspect many others. At its last gasp, it produced some of its best work ever -- amazingly original television shows like Lost (season one, at least), and brilliantly executed retreads of old concepts such as Monk and Battlestar Galactica. Many television critics observed that television was entering a golden era, just as its audience began to disappear. And probably for just that reason. As network executives flailed about trying to figure out how to get ratings, it must have occurred to some that perhaps actual good writing might do the trick, and for a remarkable number of shows, it did.

But, just as a gas flame burns brightest when its supply is cut off, television has flamed out. The brilliant new shows have exploded as their Hail Mary concepts have been forced back into the tired "it must last five seasons of twenty-two episodes" formulae, and the writers and producers have leapt at each others' throats in some kind of insane suicide pact. It's worth noting that two of the best recent shows -- Lost and Galactica -- both nearly lost it with padded episodes intended to stretch their arcs to five seasons, but then made mid-course corrections and cut their arcs to four seasons.

The underlying problem is completely obvious and perfectly analogous to the music downloading issue.

First of all, people want to watch good shows, and they prefer not to see ad breaks. They're actually quite happy to pay for the shows without the ads -- witness the huge numbers of TV shows selling on DVD. And they're going to turn out to be quite happy to see "TV" shows which never get shown on TV (or perhaps only appear on TV as reruns -- the way movies do).

Second, commercial television is an extremely inefficient mechanism for delivering content that people want to the people who want the content, just as it is an extremely inefficient mechanism for delivering relevant ads. Guess what? The exact same issues apply to commercial radio, radio advertising, and music. Premium cable is even worse, incidentally. It's cheaper and more satisfying to watch The Sopranos, say, all at once on DVD than be strung out waiting for it to show in installments while HBO tries to tease out subscriptions.

How is this playing out?

Well, folks are buying the stuff they want direct from the source (or, as direct as they can), and ignoring the middleman. In this case it means buying movies and TV shows on DVD or from iTunes and ignoring commercial TV ... in ever greater numbers. And the writers'/producers' mutually assured destruction is just hastening the end.

Futurama has just gotten a new "season" purely driven by DVD sales.

I'm surprised that Stargate SG-1 didn't continue as a pure DVD/iTunes venture, but I suspect that was more a consequence of writers' fatigue than economics. Don't be surprised if, say, Firefly comes back as a DVD/download-first, see it repeated on TV later, production. The economics aren't incredibly difficult to calculate. Suppose a studio gets 50% of iTunes sales; a show like Lost might cost $3M per episode to make and (currently) make only $250,000 from iTunes sales alone. But imagine if Lost were produced a bit leaner (e.g. made in Canada or Australia), had a rabid fanbase (OK, that doesn't stretch your imagination much), and came out on iTunes exclusively one week before it was shown on TV.

This is speculation, but it seems pretty inevitable to me.

I'm currently watching seasons one to three of The Wire on DVD. This is a brilliant show that could never be shown on US commercial TV and doesn't fit the network formats at all (1h episodes, profanity, nudity, violence, and 12 episodes per season).

The commercial television networks don't have a big fork in their asses just yet, but their asses are probably tingling in anticipation.

Mortgage Brokers



Most if not all of my tiny readership will be surprised to know that retail financial services are one of my pet peeves. I worked in the industry for a couple of years and have a pretty good idea of how it works, and my wife does research in the area of persuasion (like most of her colleagues and her mentor she comes at it from the angle of having been -- like most of us -- more often a victim than a perpetrator).

Anyways, according to the foremost expert on the subject, Dr. Robert Cialdini, there are six basic principles of persuasion:


  • Reciprocity: "even if you don't like it, you can keep the steak knives as our gift to you".

  • Commitment and Consistency: "either you're for us, or you're against us"

  • Social Proof: "are you missing the highest rated drama to come out this Fall?"

  • Authority: "nine out of ten dentists agree"

  • Liking: ok this is so obvious it doesn't need an example

  • Scarcity: "this is a strictly limited offer, limit of five per customer"



Scarcity is listed last because, for some reason, it's the last one in the Wikipedia summary and I'm too lazy to look up the order Dr. Cialdini chooses, but manufactured scarcity is perhaps the most overused tactic because it's cheap (usually free) and extremely effective. This is why folks will buy stuff on sale that they don't have any conceivable use for.

Anyway, the reason I dredge up this sordid topic is that -- like many other Californian mortgage owners -- I've been looking at refinancing a home loan. Homeloans are, at heart, a very simple, low risk, high margin product. You buy some real estate worth X, but you only have some amount Y, much less than X, that you're able to pay. You borrow the difference (plus padding, P), X - Y + P, at an interest rate I (which exceeds the "dead safe investment" rate for, say, government bonds by a profit margin and a risk assessment), and you offer the real estate as collateral.

So the basic principle is that if you can't pay, the lender gets the real estate which is virtually certain to be worth a lot more than the missing money, because real estate doesn't drop in value ... right?

Out of Whack



Unfortunately, homeloans have become a LOT more complicated because all the underlying assumptions prevented a lot of people from buying houses, especially in really insane places like California where house prices are completely out of whack.

How can house prices be "completely out of whack"? Surely the US is a "free market" country and the market must -- by definition -- be correctly gaging (I just discovered that Americans can't spell "gauging" either) the value of houses.

Well here's the way you can tell if property values are completely out of whack. If the rental income from a property is nowhere near the cost of renting the same amount of cash (i.e. a mortgage for the whole darn thing) then the property value is completely out of whack. Cash, you see, is a perfectly liquid asset. You don't have to pay special taxes just to have it (quite the contrary). You don't need to paint it. It doesn't get vandalized. You can easily exchange it for anything of equivalent value. A house, on the other hand, has many many annoying disadvantages including the preceding, as well as being highly illiquid and generally costing a lot of money (6% in commissions for starters) to convert into cash.

So on the one hand you have $500,000 in cash, which would cost at very minimum $30,000 a year to rent (in the form of a secured loan), but on the other hand you have $500,000 "worth" of house, which would cost an absolute maximum of $20,000 a year to rent. Out. Of. Whack.
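The sanity check is a one-liner. A sketch, using the illustrative figures above (the 6% secured-loan rate is an assumption in line with late-2007 mortgage rates):

```python
# "Out of whack" test: compare the annual cost of renting the cash
# (interest on a loan for the full price) with the annual rent
# the property itself could actually fetch.
price = 500_000
loan_rate = 0.06                        # assumed secured-loan rate
cost_to_rent_cash = price * loan_rate   # $30,000/year to "rent" the cash
market_rent = 20_000                    # generous annual rent for the house

print(cost_to_rent_cash > market_rent)  # True => out of whack
```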

Needfully Complicated



If a house is clearly worth X, and someone wants to buy it, but can only front Y, but is probably able to pay off a loan for X - Y + P at an interest rate I that is safely above the interest rate a bank pays its depositors, then it's an obvious winning deal, and the resulting product is simple.

But if the house is not clearly worth X at all, someone wants to buy it but maybe can't front any money at all, and maybe can't pay off a loan for X + P even at fake interest rate I - Q, and the lender isn't a bank with depositors, but a speculator with a bunch of juggled loans, IOUs, and chutzpah, then the product gets a lot more complicated.

That's bad enough. It's very bad that financially illiterate people have been royally screwed by a huge, byzantine, greedy, and surprisingly incompetent bunch of people trying to make quick money in seriously out of whack property markets, but the real problem is that the homeloan product has become ridiculously complicated for everyone and will probably never get fixed.

All I need is your social security number



When you talk to a mortgage broker (at least, a typical call center one -- the honest ones are quite different) you'll be asked a whole bunch of questions, typically:

Where is the property?
How much is it worth? (Rough guess is fine.)
How much do you owe? (Rough guess is fine.)
How much are you paying? (Rough guess is fine.)
What do you make? (Ballpark is fine.)
How's your credit? (Ballpark is fine.)
What is your social security number?

Here's the rub. It's OK to guesstimate the amount of money involved, the amount you owe, how much you're currently paying, and your income, but to get to the next step, they need your social? WTF?

They don't need your social security number at all -- not at this stage. Either they don't need any of the information they asked for (aside from the address of the property), or they don't trust you, or it is what it is: a commitment and consistency play. The next step, after they offer you a completely rubbery set of figures (assuming you gave them your social security number) will be to ask for a deposit which will be completely refundable at signing. The question is, will it be completely refundable if there is no signing?

Getting to Yes (or No)



The fact is that, underneath it all, a home loan that doesn't suck is still a very simple product. You're borrowing Y < X to buy (or continue "owning") a house worth X, and you promise to make the payments or hand over the house. There's no real risk for the lender (since X > Y, and X is very unlikely to drop) and the borrower knows exactly what he/she is signing up for. It follows that a mortgage broker who is dealing in mortgages that don't suck can easily tell whether he/she has a good deal for a prospective customer with just three figures: X, Y, and your credit rating. Based on this, they can either sketch out a very compelling offer, or tell you that they have nothing to offer you at this time.

If the broker needs a day to get back to you with figures, then basically they're figuring out how to sell you a home loan that seriously sucks.

Crunching the Numbers



So, you've given some random person your social security number and they've gone off to crunch the numbers. It will take at least half an hour. More often than not it will take overnight. It's obviously a lot of work, so either they're going to work like mad to get it done in half an hour (they're excited to be working with you) or they'll get back to you in a day (because it's so much work).

In fact, everything they're doing is canned and takes no more than entering a few figures in a standard spreadsheet that sits on their server. It could be even easier, but (a) the companies they work for are spectacularly incompetent with technology (seriously, like you wouldn't believe), (b) the folks selling these products probably can't multiply, and (c) making it easier is not helpful to the sales process; it's already way too easy.

We're talking reciprocity here. They can't just tell you what your situation is instantly because then they wouldn't have done this huge favor for you, their special friend, for absolutely FREE. No obligation!

In fact, if you look at this spreadsheet you'll see that it's easy to calculate a "mortgage payment per $100,000 of principal" for any given interest rate, and so all you need to know is the amount and the interest rate and you know exactly what the deal looks like. At 6%, a $100,000 30 year mortgage costs about $600 per month. A $250,000 mortgage would therefore cost you ... look, I'll get back to you in thirty minutes after I crunch some numbers.
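In case you don't trust me, here's the entire "thirty minutes of number crunching" as a few lines of Python, using the standard amortized-loan payment formula (a sketch; real lenders tack on fees, points, and escrow):

```python
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard amortized-loan payment with monthly compounding."""
    i = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    return principal * i * (1 + i) ** n / ((1 + i) ** n - 1)

per_100k = monthly_payment(100_000, 0.06, 30)
print(round(per_100k))            # ~600, matching the figure above
print(round(per_100k * 2.5))      # scale linearly for a $250,000 mortgage
```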

The Hard Sell



Well, we've crunched those numbers, just for you (reciprocity). It was hard, complicated work, and we've got a 20 page printout to prove it was hard. Can we fax it to you? We don't actually have email here (or direct phone lines) even though we're a big established company with a fabulous website (authority).

The deal is insanely worse than you were expecting (this is bait and switch which is what you do after playing commitment and consistency), but -- and here's the good news -- it will "save" you a lot of money. Really. The monthly payment will be lower. Really.

And now's the best time to take the deal, because:

The price of real estate is rising (if you're buying) OR
The price of real estate is falling (if you're refinancing) OR
Interest rates are rising OR
Interest rates are about to rise* OR
This is a special deal, just for you for [insert random excuse]

* I've been hearing this one consistently for the last year, and they persist with this obvious baloney even when I quote The Economist. Not only are (as of writing) interest rates consistently falling, but the "R-word" is being bandied about.

The last is the Hail Mary scarcity tactic. If you hear that, you know the person you're talking to is running on empty. This is fear incarnate you're hearing.

That's it. You can tell a good homeloan from a bad one because it's simple. A bad one is complex. That's it. OK, even the simple version seems complex ... how can you tell? Well, if the simple inputs into the equation: how much you owe (before the loan and after) or how much you're paying (now and in the future) are complicated, it's not simple any more ... and either you're a financial whiz finessing tax loopholes (in which case you don't need my help) or someone is screwing you.



Final Note: Some High School Math





You probably don't know how to calculate compound interest payments. Just figuring out how to use Excel's formulae is pretty nasty. But the underlying math is very simple.

Let's suppose you're depositing $1 per month into an account that accrues monthly compounding interest at rate I (expressed as a fraction, e.g. 0.005 for 0.5% per month). The amount of money you accrue is a geometric series, i.e. after N months:

$1 (today's deposit) + $1 * (1 + I) ^ 1 + $1 * (1 + I) ^ 2 + ... + $1 * (1 + I) ^ (N - 1)

OR

1 + 1 * R + 1 * R^2 + ... + 1 * R^(N-1) (where R = 1 + I)

There's a neat formula for calculating this (which I learned in High School; your mileage may vary):

S = ( 1 - R ^ N ) / ( 1 - R )

You may have seen this with an "A" out the front, but I've simplified this by assuming the initial term is 1.

Now this tells us how much we'll have "paid back" at an interest rate of I with payments of $1 over N months.

We can work out how much we owe as: T = P * (1 + I) ^ N. (This is the principal -- i.e. the loan amount -- compounded at the same rate for N months.)

So, to find out the monthly payment, divide this result by the earlier sum:

Payment = T / S.

You can quickly verify this formula on your mortgage with one more piece of information: in the US, lenders are legally allowed to misrepresent interest rates. In many other countries, lenders must give you an APR ("Annual Percentage Rate", which is to say "the interest rate when I don't flat out lie"), but in the US they're allowed to say 12% when in fact they mean 1% compounded monthly, which is significantly more (12.7%).
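The 12.7% figure is a two-liner to check -- compound the monthly rate twelve times and see what annual rate you actually end up paying:

```python
# Effective annual rate when "12%" really means 1% compounded monthly.
monthly = 0.01
effective = (1 + monthly) ** 12 - 1
print(round(effective * 100, 1))  # 12.7 -- not 12
```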

That's it.

Friday, December 21, 2007

Is Leopard Apple's Vista?



As I write this, I'm installing Leopard on my Mac Pro, having used it since release on my MacBook Pro, so you can take that as my firm "no" vote.

Here's what's wrong with Leopard as far as I'm concerned:

1) The translucent menu bar is a bit ugly. I think I'll use a command line hack to fix it.

2) The dock with reflections (on the bottom of the screen) looks stupid. I've moved my dock to the left side of the screen, which works well and looks fine. I should have done it years ago (when I started using 16:9 aspect ratio displays) but Leopard forced me to do that, or use a command line hack to fix that as well.

3) Until 10.5.1 came out, my MacBook wasn't going to sleep properly. Now fixed.

Aside from that, Leopard has three compelling features that I was missing terribly when using Tiger:

1) Apple tweaked Spotlight to work as an app launcher. I far prefer the improved Spotlight to Quicksilver. If OS X were open source this would have been backported to Tiger, but Apple prefers to make money. Oh well.

2) Stacks are great (if I could customize the icon of a stack they'd be perfect). They finally eliminate the need for something to replace OS 9's wonderful but flawed tabbed Finder windows and the Apple menu, and they're better than either.

3) Spaces is the first virtual desktop solution I've ever not given up on after a week. The ability to have apps automatically launch into specific workspaces, plus the integration with Exposé, is very good (not perfect).

Here's what's great about Vista compared with XP:

Nothing, although I do visually prefer Aero to XP's default theme (a low bar). I find the blurry window frames very distracting and ugly, and the laptop I run Vista on gets hot if I so much as open a text editor.

Here's what's wrong with Vista:

1) Sluggish waking from sleep.

2) Idiotic confirmation dialogs.

3) Idiotic automatic updates are basically about as bad as having viruses on your computer installed by the vendor. Except that you could clean viruses off your system; this crap is working as intended.

I don't hate Vista. I don't prefer XP. Both are pretty decent ... for Windows.

Tuesday, December 18, 2007

The Echo Chamber



ZDNet has just posted, some website called Electronista has just blogged about, and MacSurfer has therefore posted links to a pile of horseshit about how, according to Secunia, Apple Mac OS X had 234 vulnerabilities reported in 2007 compared to some tiny number for Windows XP and Vista.

I have absolutely no clue how they got these figures, possibly by googling Secunia for every single mention of Apple or BSD and 2007 and counting any hit as a vulnerability. I did click the first specific link for a Mac OS X vulnerability and instead found a report on a vulnerability in Flash Player 8. I don't think they're including application vulnerabilities in the Windows totals (e.g. they're not including that one).

A quick visit to Secunia's site shows that all reports for Mac OS X (10.0 to 10.5 client and server) numbered 27, while Microsoft Windows XP Professional numbered 30.

Leaving aside Secunia's slight pro-PC bias in choosing threat levels for vulnerabilities (which I've discussed previously), this idiocy could be debunked by going to the source and checking in less than five minutes -- including posting this blog entry.

So much for the blogosphere.

Sunday, December 09, 2007

What's Wrong with the Kindle™?



I'm late to the party commenting on the Kindle; I don't own one and haven't touched one. The physical layout (e.g. easily mis-hit next-page buttons) is dumb, and judging from videos the UI seems sluggish. I have touched several Sony whatever-they're-calleds: none of them were working properly, but I could see the display well enough to not care for it that much.

Here's my one line summary of what's wrong with the Kindle that no-one I've read seems to have picked up on:

It's yet another damn thing.

Here's a secret to setting the world afire with a new gizmo. It should be better, smaller, and more convenient than a market-dominating thing at some particular task, e.g. taking photographs, reading books, keeping track of contacts and appointments, browsing and playing music, making phone calls, watching videos, or browsing the web. And it should be at least adequate at a bunch of other things you either already do but carry other devices around for, or think you might do but don't because you can't stand carrying around other devices for.

Examples: cell phones didn't really break out of the niche market until they replaced address books and business diaries. There was that moment when hundreds of thousands of business professionals suddenly stopped copying phone numbers from their Filofax (Dayplanner for Americans) and started doing the reverse. Suddenly you got a pocket-sized electronic diary and, oh, look, it's also a cellphone, which is quite handy.

The iPhone is awesome precisely because it's a world-beater at a couple of things you already do (e.g. use an iPod) and it's perfectly adequate at a ton of other things.

I already carry a laptop, a cellphone, a Nintendo DS, a digital camera, and several other gadgets everywhere. I don't want to add a frickin' Kindle. Do I need to take that out of my bag at airport security as well?

Cringely's latest column revives the biggest ongoing unfulfilled rumor in the Apple world (predating the Newton, I believe):

The fact that an iTablet could be a great e-book reader, too, is not a driving reason for such a device, I don't believe. But it's a nice capability. Read the book and watch the movie. Then watch Amazon's new Kindle go up in flames.


My MacBook Pro has a great display and automatically adjusts its screen's brightness to the ambient light level, making it a superb book reader -- if it didn't run so damn hot and had sufficient battery life, it would have everything the Kindle has as a bookreader (except the free Sprint network connection) with the advantage of not being one more damn thing. Rip off the keyboard and add wireless keyboard support, and yeah, where do I buy one?

NBC Universal and the iTunes Movie Store



This little bit of information is probably obvious to some people, but it wasn't to me, and evidently has been missed by a lot of bloggers. Everyone who keeps track of Apple knows that NBC Universal pulled all of its video content from the iTunes Music Store. Depending on your various preconceptions, this is either a sign that Apple is doomed (of course) or that NBC is run by idiots (of course) or some random and elaborate conspiracy theory.

Today, on Sci Fi (which is part of NBC Universal), I saw an ad for Battlestar Galactica Razor, and noticed something interesting: it's on HD-DVD. And suddenly, I realized exactly why NBC Universal pulled its content from iTMS. It's Platform Wars, Episode IV: The Empire Strikes Back.

When you see HD-DVD, think Microsoft. When you see Blu-ray, think Sony, Apple, and most everyone else. That's a little unfair. Some other companies with no specific Microsoft or Apple affiliation have simply been (or felt) forced to pick a side. Others, such as Toshiba, are trying to do to Sony what Sony has done to them. But you know from the existence of MSNBC that NBC has always leaned -- hard -- towards the Microsoft camp.

So it's all about allegiance to the Microsoft DRM-everywhere camp.

I guess the "NBC is run by idiots" theory wasn't too far off the mark.

Friday, November 30, 2007

Just how stupid do you think potential iPhone customers are?



A couple of days ago AT&T's CEO "spilled the beans" about Apple's plans to launch a 3G iPhone in 2008. Apparently, this is terrible for Apple because it will cause potential customers to hold off blowing $400 on an iPhone now and wait for the 3G model.

It seems to me that most potential iPhone customers fall into one of two categories:

First, when I mention to typical Mac or PC users that I recently installed Leopard, the most common response from either group is either: (a) "What's Leopard?" or (b) "Oh, is it out already?" Most people do not keep up with this shit -- they have lives, or at least other interests.

Second, if you're someone who does keep up with this shit, you already knew that Apple was planning to release a 3G iPhone in 2008. How did you know this? Because either (a) Apple is freaking retarded, and yet somehow has managed to release a succession of world-beating products over the last six years or (b) they're developing a 3G iPhone.

So exactly which category of potential iPhone customer will this information give pause to? Well, there are folks in between the two categories of "clued in" and "has better stuff to do with their lives", and that is "idiot fan boys". The question is, just what proportion of iPhone customers fall into that gap?

So going back to the original question -- just how stupid do you (or I) think potential iPhone customers are -- there's one more piece of the puzzle to consider. Who buys someone a cell phone for Christmas?

Oh, and one more thing



Apple does have one trick up its sleeve that its competitors don't. The price it charges for iPhones is not subsidized, so when a new iPhone comes out, folks can just buy it and swap it onto their existing account. This is not the case for the vast majority of cell phones on the (US) market, which have totally bogus prices ($99 with a 2 year plan, $299 without it). It follows that Apple can transition users onto new generations of iPhone without waiting for plans to expire -- the waiting that, along with a lack of discernible improvements over time, is what killed Motorola's RAZR.

Tuesday, November 20, 2007

Unity 2.0



Unity 2 has been out for a month or so now, and I've gotten my head around most of the additions (everything except multiplayer, basically). It's a bit of a mixed bag (I'll probably write a more thorough review for MacApper) but the nutshell version is this:

The new UI support is at once great (in that it lets you do all the stuff you really need to do fairly easily) and disappointing (in that the architecture is pretty much a horrible kludge). In essence the UI code is all stateless procedural drawing code rather than a library of widgets. There's no proper event support, and widgets don't have an independent existence or retain state.

The new UI code works, is pretty, and is fairly easy to use, but it's architecturally lame: either the code to support a UI is unnecessarily complex and high maintenance, or you need to write your own stateful abstraction layer. By comparison, Director offered two UI options -- a clunky-looking, incomplete, but otherwise functional set of widgets for basic interaction which could be dragged onto the timeline, and a platform-native plugin that let you build "proper" windows with standard controls, but which was completely un-integrated with the rest of your app.

So, on the good side, the new UI code works, looks good, and is integrated fairly well with everything else (cosmetically) ... i.e. your UI widgets don't look 10 years out of date or live in their own Window. On the bad side, writing UI code is unnecessarily tedious and you'll end up reinventing all kinds of wheels... But it does seem like you could write a bunch of wrapper code for all this that could make it not suck (whereas there was simply no way to fix Director's UI issues).

Enough on that topic.

The terrain engine is similarly great (the terrain drawing tools are simply awesome, the results look fabulous, and it's all very easy to use) and incomplete (terrain doesn't work with blob shadows, the terrain shader is kind of limited, and the lightmapper won't take into account trees or other geometry in the scene, so while your terrain can cast shadows onto itself, it can't receive shadows from objects, such as buildings, placed on it).

Several notable deficiencies in Unity remain: undo is still unreliable at best; UnityScript is still not JavaScript (which would be OK if the differences from JavaScript were properly documented, but they're not); and there's still no set-breakpoint-and-step-style debugger.

Crime & Punishment



My wife and I have repeatedly received calls from someone claiming to be Countrywide Home Loans. These people, who give out the number 1-800-641-5302, are not Countrywide Home Loans (we called Countrywide to be sure, and then we googled the number, which is instructive and highly recommended).

Annoyed by these people (who use a combination of professional sounding operators and polished automated systems, so presumably they're not exactly operating out of the back of a van, but who knows?) we contacted Countrywide and told them about the matter. There things have rested for some time.

This has continued for some weeks, and when I got another call today I decided to report it to the "authorities". The recommended course of action is the FTC, but try navigating (a) their website, or (b) their phone system. E.g. after getting several levels into their spectacularly retarded menu system, I was forced to pick between two possibilities, neither of which applied, with no way out. I hung up in exasperation.

The do not call list website, for what it's worth, simply generated an error message saying that their server was having some kind of difficulty. Fabulous.

Next, I tried the local police. By far the most helpful and pleasant conversation (with a local Financial Crimes investigator) got me nowhere. She didn't even have a number I could call, but suggested I might try the FBI.

So I called the FBI, who simply told me to report the activity on a website. This website is designed (a) for people who have already been screwed (we hadn't, because we hung up when we were asked for our SSN) and (b) for internet-based crime, with phone-based crime as an afterthought.

The site made it clear we should keep hold of any documents (of which there are none) in case the matter ever went to trial, but of course we've not suffered any actual loss, and there's no paper trail. Presumably we can document the fact that the calls took place (assuming the records aren't automatically erased) but that's about it.

This is just cockeyed. Here's a bunch of scammers calling, presumably, hundreds or thousands of people fraudulently, with criminal intent, and giving out a 1-800 number. Surely there's someone in the FBI who can do a reverse lookup of the phone number, at minimum have it switched off, and at maximum tap the line, record their bullshit, and then arrest them.

If there are any further developments, I'll post them.

Tuesday, October 30, 2007

Apple's Aluminum Keyboard



I got one of Apple's new keyboards because (1) it looks really nice (and goes with my Mac Pro in a way that the transparent white one did not) and (2) I wanted to decide what I thought of Apple's little micro-usability efforts (e.g. the fractional delay required to activate the capslock).

So I love the keyboard but for some reason the function key equivalents aren't working for me. So I look at the manual (WTF?!) and discover there's a driver download (WTF2). And then I saw this:



OK, I've seen software bloat, but this is frickin' ridiculous. (How is it that Microsoft's mice work flawlessly in a Mac with no drivers -- although you can install drivers if you want to remap buttons or whatever -- but Apple somehow requires a download to support its own keyboard? Well, maybe they didn't want to just give everyone the drivers because, you know, they're 30 frickin megabytes.)

End exasperated rant.

Friday, October 05, 2007

Fitts's Law



Great article on Fitts's Law here.

This is why Macs have a menu bar at the top of the screen rather than inside windows.
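
The point can be made concrete with the Shannon formulation of Fitts's law, T = a + b * log2(D/W + 1): a target at the screen edge behaves as if it were extremely deep, because the cursor pins against the edge and can't overshoot. (The constants below are made up purely for the comparison.)

```python
import math

def fitts_time(a, b, distance, width):
    """Shannon formulation of Fitts's law:
    movement time = a + b * log2(distance/width + 1)."""
    return a + b * math.log2(distance / width + 1)

# Illustrative constants only; a and b are device- and user-dependent.
a, b = 0.1, 0.15   # seconds, seconds/bit

# An in-window menu 500 px away that is 20 px tall: you must stop inside it.
in_window = fitts_time(a, b, 500, 20)

# A menu bar at the screen edge: the cursor pins there, so the effective
# target depth is huge -- model it as, say, 2000 px.
at_edge = fitts_time(a, b, 500, 2000)

assert at_edge < in_window
```

Same distance, same muscle movement; the edge target is simply much faster to acquire because its effective size along the direction of travel is bounded only by how hard you can slam the mouse.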

Thursday, October 04, 2007

World of Warcraft BRIEFLY Revisited



Well, I was really bored and I had some money sitting in my seldom-used PayPal account, so I renewed my old WoW account to see if I might enjoy bashing stuff for a short while. (Yes, I know, flirting with old addictions is very dangerous, but I lived.)

When I quit WoW I predicted, somewhat bitterly, that Blizzard's various borderline insane decisions (e.g. changing raid caps) would drive people away from the game. I predicted that the server populations would drop by 10% or more. Frankly, with time having passed, I decided my judgment had probably been clouded by ire, and that most of the folks who played WoW would probably deal with the stupidity and soldier on.

Logging back into World of Warcraft -- and I should add that this was just me, just one specific server, and that I logged in during off-peak times on a day when Blizzard had announced rolling outages -- it struck me that:


  • the usually crowded areas (such as the hub cities Ironforge and Shattrath) were almost deserted

  • the auction houses had relatively little for sale (and bizarrely skewed distributions of things -- even more bizarre than I remembered)

  • there wasn't anything on sale in the auction house that represented an upgrade for any of my alts (I have a lot of alts), which is pretty amazing since I quit six months ago before a lot of highish end content had been trivialized

  • my main (a hunter) was able to upgrade her (very decent, but not uber) bow for 35gp buyout (not a fluke, there were several such bows on sale at roughly the same price)

  • chat was pretty quiet, even dumb questions in city/general and trade channel spam were minimal

  • no-one I remembered was online (and I knew a lot of hardcore 24/7ish players)

  • there is new content (e.g. new factions to work on, new level 70 quests that require a flying mount to complete, etc.) but it's not interesting. Oh wow, now I can fly somewhere, and collect ten doohickeys. That's different.



My brief tour included both the Horde side (where the alt-guild I had been a member of only had three members who had logged in within the past month) and the Alliance, and both "newbie" zones and high end (level 70+) areas.

From what I've heard from friends who, at least as of a few months ago, were still playing WoW, pretty much every uberish guild imploded about the time ours did (i.e. when we'd tooled up 20+ 70th level characters and were ready to hit the "end game"). Coincidence? I think not.

First, Destroy the Social Glue



In order to hit 40-person raid zones, a guild needs an absolute minimum of 60 suitably skilled and geared players. On non-raid nights, this means you have probably 20-30 raiders online, and on raid nights you have 35-45 on. Blizzard built new high-end content for raids of 10, 15, and 25, but with all the lockout idiocy of 20- and 40-person content. (Anyone with experience dealing with the ZG 20-person raids knows that lockout was really idiotic for ZG, but this wasn't so bad, as no-one cared that much about ZG loot except for three bosses, two of which were easy to get to.)

So now you end up with, say, enough people to staff two 10-person raids on an off night. They either sit on their asses, or they start raiding, causing all kinds of lockout issues on raid night. (This was guaranteed.) You had guilds (like ours) fighting over who got to go with the (perceived) "A" team versus the suboptimal "B" team formed of leftovers. Then everyone went apeshit on raid night, when there were two incomplete raids with partially locked-out players, and a whole bunch of folks who had a choice between forming raids without the best players (who were locked out) or joining a pre-existing raid and missing out on the easy loot.

And that's just the fallout from one incredibly and obviously dumb decision Blizzard made in the expansion.

Next, Make The Learning Curve Too Easy, and then Too Hard



Other dumb decisions included making the difficulty gradient for the new raid content way too steep. Pretty much all the content (including dungeon instances) in Burning Crusade is idiotically easy, until you hit raid instances and "heroic" difficulty. Then it suddenly gets ridiculously (as in "figure this fight out by dying ten times") difficult. It's the old story of the frog in a pot of water ... apply heat too fast and it jumps out. I guess a lot of people jumped out. It's like Blizzard forgot one of the golden rules of computer game design (learned about ten years into the industry's lifespan): the customer pays to be entertained; he/she doesn't have to do anything more to deserve that entertainment, and if you treat customers as if they need to earn the right to be entertained, you lose them.

This insight is most clearly displayed by the change in arcade games some time around 1984 that let you pay to continue. Stick enough coins in the box, and you get to see the whole thing. Better players can get away with fewer coins, but they don't get to see more of the game. MMORPGs have yet to learn this, but they get away with it by deluging players with so much content/tedium that they might not realize they're missing something. The problem for WoW is that the original game set a high bar, and the expansion did not reach it.

???



No-one at Blizzard with a pocket calculator seems to have done some simple arithmetic, such as "hmm, to get to revered with Enemies of Fred you'll need to kill 234,000 Friends of Fred". Well, the other possibility is that they did, and the spectacular levels of tedium (in terms of factions, keys, and tradeskills) introduced in the Expansion are deliberate.

This isn't surprising. There's plenty of evidence of this lack-of-thinking throughout World of Warcraft, it's just that it has never been piled up in huge, steaming mounds before. It's almost like Sony Online Entertainment was put in charge of Burning Crusade (but then there would be better itemization).

Profit!



Assuming, and this is a huge assumption, my experiences in my one-hour return to WoW were vaguely representative, it seems like Blizzard has lost not 10% of its players, but more like 50%. This is WAY more moribund than I would ever have guessed WoW would become in such a short period. Heck, I'd be shocked if EverQuest servers (merged as they are) were this dead. That is simply astonishing, and worth a major post-mortem.

Thursday, September 27, 2007

The Amazon Music Store: Usability & Motives



Like many people, I like Simon & Garfunkel. And like a sizable proportion of such people, I like Simon without Garfunkel. And finally, like a not-so-sizable proportion of such people, I don't mind Garfunkel without Simon. As it happens, I really like one of his most saccharine songs, "Sometimes When I'm Dreaming". I've been looking for it for some time -- I had it on vinyl, but I didn't like Garfunkel enough to replace my vinyl with CDs, and I haven't seen that specific pop song appear on ITMS.

Oddly enough, the album it's supposed to be on is on ITMS, but that song is missing, and the album is not flagged as incomplete. Well, I thought, this is a great opportunity to try out the much vaunted Amazon DRM-free music store. Feel free to go try to find some stuff on the Amazon music store while I stay here.

Anyone claiming the Amazon music store is as convenient to use as ITMS (whether or not you have an iPod, I might add) is on crack. Oddly enough, it seems many of Apple's uncritical "fanboys" are even bigger "fanboys" of anything without DRM and so have been singing the praises of a service that offers some notable bargains over ITMS while lacking range and convenience.

I'd certainly be very happy to buy stuff from Amazon rather than Apple (once I'd *found it* using ITMS or a store) to save money and/or avoid DRM -- and I expect this is how Amazon will succeed -- but Amazon should be recognised for the parasitic "Burger King" strategy it's adopting on the usability side, and for the motives of the cartel backing it (which allow the discounting and DRM-free music) on the business side.

McDonald's is famous for picking sites for its "restaurants" very carefully. Burger King is famous for placing its "restaurants" close to McDonald's. Get the idea? Apple produces a market for digital music, fabulous tools for browsing, buying, and selling that music, and so forth. Amazon basically knocks together a half-assed website (e.g. when I tried to listen to tracks, it said I needed RealPlayer) but it offers discounts and no DRM. In exactly the same way that Amazon relies on brick-and-mortar retailers to let people browse and select books, it relies on Apple to let people browse and select music.

As for price and DRM-free music: is Amazon offering the labels more than $0.70 per song? Apparently, the record companies which think Apple is greedy for paying only $0.70 per song in royalties out of its $0.99 (less operational costs, R&D, marketing, etc.) are OK selling songs encoded at double the bit-rate, DRM-free, for $0.10 less. Well, maybe Amazon is paying them a larger percentage, so let's look at Apple's DRM-free offerings ($1.29 each): at the same royalty rate, Apple would be paying the labels about $0.91 per track, so presumably the labels prefer some percentage of Amazon's $0.89 to $0.91 from Apple for the same track.
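
The back-of-envelope arithmetic, with the royalty rate treated as an assumption (Apple's actual splits aren't public; the $0.70 figure is the one commonly reported and used in the post):

```python
# Figures from the post; the royalty *rate* is an assumption, not a
# published number.
apple_price, apple_royalty = 0.99, 0.70       # DRM'd ITMS track
apple_drmfree_price = 1.29
amazon_price = 0.89                           # $0.10 below $0.99

# If Apple pays the same ~70.7% rate on its DRM-free tracks:
apple_drmfree_royalty = apple_drmfree_price * (apple_royalty / apple_price)
assert round(apple_drmfree_royalty, 2) == 0.91

# Even if Amazon passed through 100% of its retail price, the label
# would still get less than Apple's implied DRM-free royalty:
assert amazon_price < apple_drmfree_royalty
```

So whatever percentage Amazon is paying, the labels are accepting a smaller absolute take per DRM-free track than the Apple channel already offers, which is what makes the arrangement look like dumping rather than business.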

Wow, what does this sound like? It sounds like dumping! The most classic form of anticompetitive behavior by a monopoly or cartel (and the record labels are in fact both -- since only one label has the rights to, say, Simon & Garfunkel's music, and music by, say, Sting, isn't a substitute). So what we're seeing here is Amazon acting as a stooge for the music industry which would like to break the power of Apple's ITMS by dumping their music through a rival channel.

Well, I wouldn't be too concerned for Apple. First of all, Amazon has been selling stuff through its website since, what, 1995, and the user interface has been very slow to improve. Based on the current gap in usability between their MP3 store and ITMS, it will take them approximately forever to catch up, long before which Apple's usability folks will have incorporated telepathy and precognition into ITMS.

In the long run the music industry is, to put it simply, dead. They can thrash around suing teenagers and setting up gigantic anti-competitive dumping grounds to what we can optimistically call their hearts' content, but in the end, music has gone digital, digital stuff is easy to duplicate, and the music industry (as we know it) is an artifact of technology (vinyl records and CDs) that is obsolete. It didn't exist before we could record stuff, and it shouldn't exist now that recorded stuff is trivially easy to duplicate and broadcast.

Wednesday, September 26, 2007

Pixelmator: Another Core Image-based Bitmap Editor



Note: This article was originally written for MacApper, but it looks like I've been beaten to the punch, and by John Siracusa no less!

If you haven't been paying attention, Pixelmator is an attempt by two brothers to write a serious image editor based on Apple's Core Image functionality and sell it dirt cheap. There's been some talk of Pixelmator being a serious Photoshop alternative. So, is Pixelmator a Photoshop replacement, and if so for whom? Even if it isn't, is it a worthwhile program on its own?

Note: Pixelmator can be downloaded and tried out for free, and I recommend you do so rather than taking my word for anything.

Look & Feel



"If you can't think of anything nice to say, come over here and sit next to me."
Dorothy Parker



It's good to know that we've developed all this technology to exchange photos of cats.


I want to like Pixelmator. It's fairly cheap, it looks pretty snazzy (if you like translucent black palettes, and even if you don't, it has gorgeous icons), and it seems like it has a lot of functionality (lots of palettes and stuff). There's some enormously cute stuff in its user interface, such as the dynamic "string" which joins filter-settings floaters to their focal points:

It's so cute (not the cat, the string thingy)! When you drag the dialog it bounces around realistically. But look further and you'll note that instead of implementing useful stuff (like direct manipulation of angular parameters) they've wasted time doing a cool string thingy.


There's no question that Pixelmator has a bit of sizzle, but is there any steak?

Intended Audience



Photoshop is many things to many people. Originally developed by the Knoll brothers (John was then, and I believe still is, working for ILM), it may have been conceived of as a photo editing program, but for a long time most people didn't have many decent digital photographs to edit. Photoshop was, and continues to be, the premiere tool for post-processing digital bitmap images of any kind.

To this end, Photoshop has a lot of functionality which no program developed by a couple of guys in a year or two can seriously hope to emulate. The key to competing with Photoshop, then, is to pick a target audience and target its needs carefully.

Digital Photography: I use a Nikon D50 and shoot RAW. Pixelmator won't open .NEF images. For many digital photographers this essentially makes Pixelmator completely useless.

Assuming you shoot JPEGs or you use some other program to process your RAW images (and Pixelmator can import JPGs from iPhoto very easily), Pixelmator lacks some fairly crucial tools: there's no straighten tool (although arbitrary rotation is possible) that I could find, and the exposure tool (which is kind of a bad joke for a program that doesn't read RAW images anyway) is hideously crude. The Levels tool is there but it doesn't allow you to set numerical values (including gamma). There's no Curves tool. There's no red eye tool. So, basically, Pixelmator may be useful as a companion to a program like iPhoto (which has most of these things), but you'll end up having to go back and forth (and hope your initial conversion from RAW was spot on). So, for digital photographers, Pixelmator is pretty much a joke.
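
For context on what a numeric gamma buys you, here's a minimal sketch of what a Levels adjustment does (a generic model of the adjustment, not Pixelmator's or Photoshop's actual code):

```python
def levels(v, black=0, white=255, gamma=1.0):
    """Map an 8-bit value through black/white points and a gamma curve:
    normalize against the black/white points, clamp, apply gamma,
    rescale to 0-255."""
    t = min(max((v - black) / (white - black), 0.0), 1.0)
    return round((t ** (1.0 / gamma)) * 255)

assert levels(128) == 128                 # identity settings: no change
assert levels(0, black=10) == 0           # below the black point clips
assert levels(128, gamma=2.0) == 181      # gamma > 1 brightens midtones
```

The point is that gamma is a precise exponent, not a vague "brightness" slider; being unable to type the value (or see it) makes matching an adjustment across images guesswork.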

Digital Painting: Aside from offering limited brush customization, Pixelmator has nothing to offer a digital painter, and ArtRage 2.5 is both cheaper ($25) and far more functional.

Graphic Design: Pixelmator has no vector graphics functionality. You can't build a layer with an editable rectangle in it. Text layers are editable (good) but that's about it. You can't perform transforms on them, you have to go in and set font sizes. You can't draw a rectangle and have text automatically wrap inside it. There are no typographic controls whatsoever. Aside from text, soft shadows aren't supported, so you'll be doing those manually. And there are no layer styles. So you'll be doing them really manually; think back to Photoshop 3.

Video Compositing: You can't open movies directly, so you'll need to convert the frames of video into images first, and then back into video afterwards. This is sufficiently horrible that you will give up and use another program. This is a shame, since Pixelmator does put a very good interface on top of most of the Core Image filters. (The programmers seem to have simply avoided implementing the filters they couldn't wrap a nice interface around.)

Pixelmator also appears to implement some quite sophisticated masking functionality from Photoshop (e.g. you can clip layers to a layer beneath them, a really neat Photoshop function) and you can create layer masks (although it is inconvenient and unintuitive to set or edit them). This would be icing on the cake if Pixelmator did the basic stuff well, but doing advanced stuff well and basic stuff badly or not at all makes this functionality essentially pointless. Anyone sophisticated enough to appreciate sophisticated masking will be put off by lousy selection tools.

Highly Technical Users: people who want precise channel controls, formula-driven filters, etc. -- look elsewhere. If you want a good, dirt cheap front end for the Core Image filters, get Acorn. If you can't afford Photoshop and can stomach The GIMP, well The GIMP is free.

So, all this raises the question: for what audience is Pixelmator intended? For photographers, it lacks RAW support and the most commonly used photo editing tools aside from Levels, Crop, and (an astonishingly slow to undo) Rubber Stamp. It would be fairly easy to add most of these (especially once Leopard comes out with robust OS-level support for RAW images), so it's possible that Pixelmator is really just a public beta of the post-Leopard version. For digital painters, graphic designers, and video compositors it's basically useless. For highly technical users, such as myself, it's not even a toy.

Dumb Stuff



Photoshop is much lauded for its excellent user interface (sure, it's cluttered, but when you have about 5000 distinct functions, there's no way to avoid this) and sensible, consistent keyboard equivalents. So it's pretty amazing (given its intended audience) that Pixelmator lacks simple things like:

No Command-L for Levels. In fact Command-L does nothing, and you type Command-Shift-L for Levels. That's just dumb.

No Command-T for Transform. That brings up the Cocoa Font floater. OK, that's sort of a standard (of course, it doesn't fit in with Pixelmator's "any color so long as it's black and translucent" interface).

Magic Wand tool doesn't appear to do anti-aliased selections. There's no "new layer from selection" functionality, so you make a selection, and then cut/copy and paste it to make a new layer. But the layer won't be in the right position.

You can't drag across layers to turn several on or off at once. You need to click each layer's check box individually. Annoying.

Multiple Layer selection doesn't work properly. You can't select two layers and drag them around together.

No Delete to fill selection with Background color, Option-Delete for Foreground Color. Try Command-Option-F for fill, and foreground is your only option (so no background fill and no pattern fill).

But the kicker is: Bad* selection tools. Here's a simple exercise I do with every bitmap image editor I ever use:

1) Draw a rectangle.
2) Get the marquee selection tool.
3) Carefully place its crosshair in the top left pixel of the rectangle, and drag-select down and to the right. See what you got. Repeat for each corner of the rectangle. If your selection doesn't include the pixel your cursor started on, drag the program to the trash. Or, if it has indispensable functionality (like GraphicConverter), use it for what it's good at, but never ever use it to edit images.

* Actually I'd use a much nastier word than "bad" here, but this is a family web site.
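
The expectation behind steps 1-3 can be written down precisely (a hypothetical model of inclusive marquee semantics, not any particular app's actual code):

```python
def marquee(x0, y0, x1, y1):
    """Return the set of selected pixel coordinates for a drag from
    (x0, y0) to (x1, y1), *inclusive* of the anchor pixel, whichever
    corner the drag starts from."""
    xa, xb = sorted((x0, x1))
    ya, yb = sorted((y0, y1))
    return {(x, y) for x in range(xa, xb + 1) for y in range(ya, yb + 1)}

# Drag from the top-left pixel of a 4x4 rectangle at (10, 10) down-right:
sel = marquee(10, 10, 13, 13)
assert (10, 10) in sel            # the pixel the cursor started on is in
assert len(sel) == 16             # exactly the 4x4 rectangle, no more

# Repeat from the bottom-right corner, dragging up and to the left:
assert (13, 13) in marquee(13, 13, 10, 10)
```

An editor that excludes the anchor pixel (or fudges it differently per corner) makes pixel-exact selection of a known region impossible, which is why this one test is so damning.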

Conclusion



Pixelmator is expensive enough ($59) that people will probably think twice before just paying for it. My guess is that it will sell a few licenses to people who are impressed by its sizzle and don't notice that there's no steak. As an image editor it is inferior to less hyped programs, such as Acorn ($39.95), and it has no really excellent core functionality which some target audience will find indispensable to fall back on. It doesn't implement all the Core Image filters, its layer controls are merely adequate, and its painting tools are all flawed (e.g. the rubber stamp is very slow to undo, which is a very common operation when using the rubber stamp).

What I can't tell you is whether, if I'd never used any other bitmap editing program, I would think Pixelmator was great. I can't unplug chunks of my memory and look at Pixelmator with fresh eyes. But I can tell you this: MacPaint 1.0 had better selection tools.

I guess now we wait for Iris to come out.

Aside & Disclaimer: I've been using Photoshop since version 1.0 in 1991. Before that, I used Thunderscan, Digital Darkroom, Fullpaint, and Studio/32. Since then I've used ColorStudio, Fractal Painter, and pretty much every other bitmap editing tool on the planet. I love Photoshop but kind of hate Adobe.

Monday, September 24, 2007

Vista, The Gift That Keeps On %$#@ing






So, today, my copy of Vista Home Basic (retail, full price, etc. bought for me by our IT department to test our stuff on) demanded that I activate it. OK, annoying and alarming, but I'll bite. So I clicked the Activate button (or whatever it was) and held my breath. A few seconds pass, and then "Activation was Successful". OK, so a bit alarming, scary if I were -- say -- not online at the time, but no biggy.

Then, a few minutes later I notice the following at the lower right of the screen:

"This copy of Windows is not genuine."

Slightly panicking (only slightly, because I really don't give a rat's ass) I search the help for "not genuine" and follow the most appropriate seeming link. I eventually activate Vista again. Same thing -- Activation Successful.

And the message still reads:

"This copy of Windows is not genuine."

I guess XP is genuine.

Thursday, September 13, 2007

Moveon.org: you are an embarrassment to us all



Don't accuse people of being traitors for doing their jobs. Yes, you didn't use the word "traitor" or "treason" -- you used the word "betray". Oh yeah, and you made play of a guy's name.

Moveon or Moron? Hahahaha so clever.

Seriously, as a lifelong Democrat, I tell you guys to get a clue. You've just managed to completely miss the point.

Instead of insulting people, ask them intelligent questions. Here are the questions I would ask General David Petraeus:

1) Do you think we will win in Iraq?

2) Define win.

3) What is the likely cost of our current course of action? (Lives, dollars, etc.)

4) What is the likely outcome of serious alternative courses of action, such as leaving, or invoking the draft and sending in 200k more soldiers?

5) What is the likely cost of leaving, or invoking the draft?

If the answer to any of these is "I don't know" then who does? Let's ask them.

Once we have reasonable answers to these questions, then we can decide for ourselves whether to stay or go. Until then, we don't have enough information to make a reasonable decision, in which case I'd err on the side of saving money, blood, and prestige and leave as quickly as possible.

Next Generation Computers



We've by now all seen the new iPods, and of course the iPhone. I also saw an interesting post on Mac 360 to the effect that Macs are kind of boring, with no real changes in five years (aside from better performance) other than the introduction of the Mac mini. Good point.

The obvious next thing to fold into the laptop is cellular internet (3G, whatever). Having to stick a card into your laptop to get wireless internet sucks just as much as having to stick in a card to use a modem or an external hard disk did five years ago, and Apple should address this. (My pet suggestion was to build MacBooks with an iPhone slot in them, but that would represent a huge waste of space for those of us, like me, who haven't bothered with an iPhone.)

It seems to me that there are, broadly speaking, three niches for computers today: desktop, inconveniently portable (i.e. notebooks), and conveniently portable (i.e. pocket-size). There's also the data storage unit which may or may not have playback and ancillary devices attached (i.e. the "datastick", a.k.a. iPod).

I've been shopping for audio recorders lately, and this just reminds me of the fact that we all still need a datastick, and the iPod still isn't one. The iPod classic isn't because it doesn't have a small general-purpose computer in it, although 80GB/160GB of storage is just dandy. The iPod touch isn't because it doesn't have enough built-in storage and/or removable storage. Neither has a convenient camera or audio recorder built in, and, frankly, both need built-in speakers -- even if they're crappy.

So this would be my computer lineup (in a perfect world):

Desktop


Mac Pro (like current Mac Pro, but smaller, second CPU optional for base model, ~$1500).
Mac Mini (like current Mac mini but a bit bigger -- room for real video card and hard disk).
Mac Nano (like current Mac mini but possibly Flash RAM-based and smaller).

I think the iMac is intrinsically evil -- because it makes you toss a good monitor when your cpu starts to age. Build monitors with a bracket for a Mac Mini/Nano instead.

Laptop


MacBook Pro (like current MacBook Pro, but provision for internal cellular internet, new style keyboard, CPU is a Mac Nano which can be swapped out / docked).
MacBook (like current MacBook, but Flash-based and thin).
10" MacBook (Flash-based)

Laptops can function as iPhones/cellular net devices if you have the optional receiver (as with Bluetooth options a few years back).

Pocket


Pocket MacBook (Basically a clamshell iPod touch with faster cpu, slightly larger screen, real keyboard. Oh, and it's an iPhone too.)
iPhone/Mac -- but fully unlocked, doesn't pretend it's not a Mac, can work with keyboard accessory which doubles as a stand and dock.

In Summary



Apple should completely blur the distinction between Mac and iPhone and iPod Touch -- making the halo effect irrelevant. There are 100,000,000 iPods out there. If 25% of them get replaced, turn those 25,000,000 new iPod users into 25,000,000 new Mac users. Declare victory. Withdraw from Iraq... Oh wait, that's part of my Steve Jobs for president rant... Never mind.

Wednesday, September 05, 2007

iPhone, Thou Art Dead To Me



Apple has released the perfect iPhone ... i.e. an iPhone that isn't saddled with an AT&T plan and, um, doesn't make phone calls (...yet). At last, a browser perfect for reading books on the toilet.

They've also added custom ringtones on steroids to the iPhone (you still can't just grab some arbitrary piece of audio and make it your ringtone because, um, well because) and you can spend money at the iTunes store anywhere you get cell reception and have your iPhone, instead of merely anywhere you have internet access.

And it's about the same price as a similar gadget from Nokia that doesn't do most of the stuff it does, looks a whole lot clumsier, isn't touch-based, and appears to have lower battery life. But it does support some version of Flash. Obviously, the new iPod Touch is grossly overpriced...

Numbers, Revisited



OK, here's my second opinion on iWork '08. It's incredibly good, just frustratingly imperfect.

As I think I mentioned, I have two "killer documents" that I try to work with in every word processor and spreadsheet that comes along. The first is a complete set of role-playing rules, featuring complex table styles and cross-references.

Pages comes so close to handling this well I can taste it, but it's not quite there yet. Still, the only program ever to handle this document gracefully has been Adobe FrameMaker (formerly just FrameMaker). It's kind of hard to complain that a simple word processor aimed at folks writing family newsletters to send out at Christmas can't handle a long, hideously complex document effortlessly.

Second, Numbers is annoyingly missing some features such as multiple heading rows and vertical or angled column headings that would just be thrilling, but I've managed to use it to implement a working character sheet (as in fully automated) in about two hours, and rewrite my entire magic system (including 300 or so mix-and-match spells) in a couple of afternoons. Not too shabby.

I might note that Word and Excel are equally flawed in their handling of both documents, harder to use, and quite a bit pricier. Oh and slower and not yet Universal Binary.

Sunday, August 19, 2007

iWork '08: I Wish I Liked It


As usual with Apple products, there are lots of reflexively pro- and anti- reviews. Most of the reviews focus on Pages, because Keynote is so obviously the best presentation program around that there's no point even discussing it, and the vast majority of people don't use spreadsheets for anything serious.

I've got a few documents lying around that have been through every word processor or spreadsheet option there's ever been. My pons asinorum for word processing is the ForeSight rule book, a horribly complex document featuring large, complex tables, graphical diagrams, indexes, cross-referencing, footnotes, margin notes, and more. The only programs that have ever come close to dealing with it are (in order of best to worst) FrameMaker, Microsoft Word, and FullWrite Professional. The first thing I did after installing iWork '08 was import ForeSight in its latest incarnation from a Word (2004) .doc. It imported almost without a hitch (it warned me that some of Word's more esoteric formatting options aren't supported), but after working with it briefly in Pages I am inclined to persist with Word.

My equivalent document for spreadsheets is an interactive ForeSight character sheet which does all your book-keeping for you automagically (in essence, a freeform modeless character creation tool). I've never managed to build one of these without failing to implement some of the rules, but the closest I've gotten has been using FileMaker Pro. I built the character tool from scratch in Numbers in about two hours: by far the easiest implementation I've ever managed, thanks to the nice way it handles tables, but the irony is that Numbers fails on the cosmetic front! (Not that FileMaker Pro, Excel, Wingz, or Claris Resolve did better cosmetically, but given Numbers's close relationship to Keynote, it amazes me how little attention its layout functions have received.) For example, you can drag ruler guides out into your sheet, but they're always editable, so it's impossible to drag a table edge that's near one of them -- you always end up hosing your guide.

The table implementations in Pages, Keynote, and Numbers are similar, but subtly different, which is infuriating on its own. Pages has excellent stylesheets, which work very badly with tables. Numbers has table styles, but they don't translate to Pages. They also have some mysterious limitations and odd behaviors. E.g. if you copy and paste cells, formatting moves with them, even into headers. I ended up copying stuff to TextWrangler, then copying it out of TextWrangler back to Pages, just to clear formatting. Header cells can't include calculations or be included in calculations. You can only have one header row, one footer row, and one side-header column. If a column or row contains merged cells, it can't be hidden (and it's not clear why; it took me ages to figure out what was going on).



But what really annoys me about Numbers, what is truly egregious, is that the metrics of tables are non-deterministic. I built a custom table style, and then put two identically styled tables side-by-side. Guess what: their rows don't line up. I cannot figure out how to fix this and it's annoying as hell. I've read rants here and there about certain aspects of Cocoa's graphics being utterly, deeply, and profoundly broken, and this appears to be an example of it.

On the whole, I'd rate Keynote as being as awesome as ever, Pages as being nice for casual stuff but broken for anything really complex, and Numbers as being great for casual stuff but limited.

Tuesday, August 14, 2007

Windows vs. Mac Security. One of these operating systems has a destructive virus built in



Oh the irony. So here I am watching the last Steve Jobs keynote (the aluminum iMac introduction) on my Dell Windows Vista laptop (the one I use for testing the software I write, and incidentally use to surf the web when in bed) and Windows logs out on me without warning.

Why? Well to update Windows of course.

It's funny how Windows thinks that it's OK to shut down my computer without so much as a by your leave in order to patch itself, since -- presumably -- the reason you patch your computer is to fix security problems and bugs, each of which could potentially cause your system to crash without warning or corrupt your data.

In contrast to this, when my Mac patches itself, its updater patiently waits for me to restart.

I'll take the OS without malware built in by design, thanks.

Sunday, August 05, 2007

iPhone, Nintendo DS Web Browser, Compatibility



I'm a sucker for weird web browsers, so I just had to go and buy the web browser for my DS. It uses both cartridge slots -- a memory module goes into the Gameboy Advance cartridge slot, and the browser itself goes into the smaller DS slot.


This shows my flexible inVue test page running on the DS's browser (correctly!). Yes, I realise that some of you who didn't already hate me do now...



The DS's web browser is written by the Opera people, and it represents a fine example of why Apple rocks and other software companies don't. It is possible to configure this browser so that it is reasonably usable and makes sensible use of both screens, but it's certainly not obvious how. The screen you're seeing in the image is the lower (touch) screen, and the blue rectangle is the zoomed-in portion being shown in the upper (non-touch) screen. In this mode you can press a button to toggle a sub-mode that lets you tap within the zoomed area. In the DS browser's default mode, however, web pages are displayed in mangled form (basically ignoring most formatting) across both displays, treating the two as a single tall, narrow display. This mode is almost entirely useless and disconcerting.

I visited an Apple store and asked one of the helpful people if I could try out an iPhone in EDGE mode. They very helpfully showed me how to disable the WiFi support, so I was able to get a more realistic view of the iPhone browsing experience at its worst, and it is pretty nasty, at least on ludicrously heavy pages like gamespot.com. Unlike the DS, the phone can handle multiple pages at once and load pages in the background -- so you can diddle around on your iPhone or drink coffee while a page loads, but it's still darn slow. That said, the DS is painfully slow even with a WiFi connection and appears to cache absolutely nothing (not even the parts of the current page that are scrolled off-screen, so just scrolling a page makes you want to hurt someone).

But my original reason for doing all these things was to make sure my ad code works properly on exotic platforms, and it does. (My video ad code has issues related to Safari/iPhone only supporting a subset of QuickTime codecs and not supporting Flash at all, but that's not really under my control.) It astonishes me how little effort it takes to make quite complex bits of JavaScript work properly across browsers, and yet how much terribly incompatible JavaScript is out there, even on quite major sites.
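For what it's worth, most of that cross-browser effort boils down to feature detection rather than browser sniffing. Here's a minimal sketch of the kind of shim I mean -- the helper name `addEvent` is my own invention, but the two event models it detects (the W3C `addEventListener` used by Safari, Firefox, and Opera, and the `attachEvent` model used by IE 6/7) are real:

```javascript
// Cross-browser event attachment, circa 2007. Detecting features
// rather than sniffing user-agent strings is what keeps scripts
// working on exotic platforms like the DS's Opera browser.
function addEvent(element, eventName, handler) {
  if (element.addEventListener) {
    // W3C DOM model (Safari, Firefox, Opera)
    element.addEventListener(eventName, handler, false);
    return true;
  } else if (element.attachEvent) {
    // Microsoft model (IE 6/7); note the "on" prefix
    return element.attachEvent("on" + eventName, handler);
  }
  // Last resort: the old DOM 0 property
  element["on" + eventName] = handler;
  return true;
}
```

Usage is the same everywhere: `addEvent(someElement, "click", myHandler)`, and the shim picks whichever mechanism the browser actually supports.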

Thursday, August 02, 2007

While I wait for a better iPhone



Well, I finally played with an iPhone (that's right, despite all my posts about it, I didn't queue to buy one, and didn't even visit an Apple Store to see one for weeks after the release). Mainly, I wanted to see if my video ad serving technology worked on one (it does, but since the iPhone only supports a limited range of video options, most of my test videos wouldn't play). My conclusion is that I won't buy one until it has a ton more storage (preferably with removable media as an option) and better bandwidth.

Assuming that what you basically want from a phone is (a) a pretty good phone and (b) something to surf the web, and (c) that you already have an iPod ... the missing component for me -- and I suspect a lot of people -- is the web browser. And given that you probably find surfing the web via EDGE to be pretty unbearable, what you really want is a WiFi web browser with decent battery life that is rugged and fits in your pocket. Ideally it will be cheap enough that if you lose it you won't be shattered psychologically and financially.

Well, Nintendo has released a web browser cartridge for the DS (it's Opera, of course). Darn it, I wish they'd simply add a physical keyboard and an IDE.

So, I now have my 5G iPod, Motorola Razr, and for $30 I can convert my $130 DS into a browser. The downsides, of course, are significant: (a) three gadgets vs. one; (b) no cellular internet (well, I could have it on the Razr, but why bother?); (c) smaller screen; (d) no spiffy touch interface (the DS's touch interface is kind of pedestrian); (e) web mail is the only email option (and it's not cellular); (f) no integration: if the phone rings you need to turn something else off to talk; (g) none of the really great functionality you get from synergies (e.g. camera + email, web + email + phone); (h) the Razr, on its own, even with Bluetooth enabled and set up, is more of a pain to synch than an iPod, and the DS can't synch at all; (i) it's even geekier than having an iPhone, and some folks will think you're infantile for using a DS in public.

Upsides are (a) each device individually has more battery life than the iPhone (although with every house, office, and vehicle I have access to festooned with iPod docks, cradles, and chargers, iPhone battery life seems like a minor issue); (b) the DS browser arguably has a better keyboard (pen-based); (c) the DS is insanely rugged and doesn't look that great to start with, so I won't get worked up over nicks and scratches; (d) far more storage (30GB in my case); (e) you can, apparently, play games on the DS.

When you weigh the pros and cons, the iPhone is definitely better overall than the iPod + Razr + DS combination, and the base model is even cheaper ($499 vs. $249 + $99 + $129 + $29). On the other hand, the marginal cost of $29 to let my DS surf the web buys me time to wait for MacBooks with iPhone functionality, or an iPhone with decent storage capacity, better broadband, and the 1.0 kinks worked out.

The gPhone



Google is, apparently, working on its own phone. Google isn't exactly a stranger to the hardware world -- they do all kinds of hardware work internally (ranging from building their own infrastructure, to cargo containers that contain a decentralized server hub that can be shipped anywhere and plugged in, to immense, highly optimized server farms) and even sell some hardware products (enterprise search engines that can be installed on a corporate LAN and remotely administered). But Google is a stranger to consumer hardware.

Alan Kay once famously remarked that if you're really interested in software, you build your own hardware, but the more I think about this, the less it makes sense to me for Google to release its own phone hardware (except possibly as a reference platform).

Economics 101 dictates that you want your complements (products that help consumers use your product) to be free or cheap and ubiquitous (makes sense), and competing products to be expensive and rare. For Google, web browsers are their ultimate complementary product. If Firefox and Safari didn't exist, Google would have had to invent them.

So what Google really wants is for every cellphone out there to be a web browser with full "Web 2.0" support -- i.e. basically an iPhone. But to do this it needs to make them good, cheap, and very common -- something Apple can't or won't do.

It seems like the best way to do this would be to produce a great cellphone OS and license it for next-to-nothing. This would simultaneously help push Microsoft out of this space and turn lots of cellphones into Google-friendly web browsers. Rather than having to figure out how to manufacture and sell phones at a profit, Google would simply help existing phone companies do this, and make more profits the way it already does: via web advertising.

Google would probably be just as happy if every cellphone became a standards-compliant web browser without their help. The question is whether Google needs to do anything (now that Apple has essentially raised the bar for cellphones across the board) except wait.

Friday, June 29, 2007

It's. The. Usability. Stupid.






So you have room for nine icons (almost) on your main screen. Do you (a) use one for a "clock" rather than, say, display the time somewhere in your utterly useless status bar and menu bar? (b) use one icon for a "clock" and another for "date and time" (given you're already showing the date anyway)? (c) use a third icon for "calendar" because two just wasn't enough? or (d) add a Windows 98-style gradient bar up the top to waste even more space? If you answered (d) you're ready to design Open Source UIs and take on Apple in the consumer space.


I saw something pretty funny on Digg yesterday. The link didn't work (which was a sign in itself) but googling got me this. Here's the summary: real soon now™ there will be a Linux-based smart phone with 3G network support and a touch screen that does everything the iPhone does, only better, and runs Linux -- sorry, GNU/Linux -- and is totally, utterly open. So it will be better than the iPhone in every way.

Woohoo! At last I can use something other than my tin-foil hat to communicate with the mothership.

Here's the problem. Aside from being "open" ... pretty much any non-crap cellphone does everything the iPhone does ... at least to some extent, and is more "open" to third-party development. The iPhone isn't different and better than those phones the way, say, a current MacBook Pro is better than an Apple II. It's better than those phones in the same way that a MacBook Pro (running Mac OS X) is better than a MacBook Pro (running GNU/Linux). When folks suggest Apple has a five year lead on rival cellphone companies, they mean that Apple's software is five years ahead of rival cellphone software the way Mac OS X is ahead of, say, GNU/Linux. (Since GNU/Linux is actually about five years behind Windows, it's more like an eight year lead on GNU/Linux.)

And the lead isn't in features. Every computer is a Turing machine limited by finite RAM. The only fundamental difference in ultimate capabilities between any two computers is their peripherals and data capacity. The difference for users is in usability.

As my father used to say, "Chocolate is good. Chicken is good. Chocolate-coated chicken must be excellent." It works even better when both ingredients suck individually, right? Linux is a usability nightmare. Cellphones are a usability nightmare. But a Linux cellphone is going to rule! Kind of like Kentucky-fried chicken smothered in rancid chocolate.

Thursday, June 28, 2007

It's a pocket-sized one of these...






I just read one of the most intelligent articles I've come across in the last few months, and it wasn't in the New Yorker. I recommend you click the link and read it, but if you prefer an executive summary: the iPhone is a pocket-sized networked computer that replaces all the crap you currently need to carry around to do business (i.e. PDA, phone, laptop), is cheap enough that you can buy it yourself rather than wait for IT to relent and support it, and it's being sold as a phone because people understand phones.

I remember when the Newton came out in 1992 (or was it 1993?) and I thought it was going to be equally disruptive. In the end, the Newton failed largely because while it eventually did more-or-less everything it set out to do very well (as of the MessagePad 120) it wasn't a rounder wheel -- it didn't replace anything you already needed to carry around, it was just a really good ... whatever it is that it was.

The iPhone is a better phone than your phone, a better iPod than your iPod, and a better laptop than your laptop (well ... it's smaller, has better battery life, and it always has a 'net connection). OK, it won't replace my laptop across the board, but it certainly can replace my phone and iPod, and I'll always have it handy, whereas I don't carry my laptop with me when, say, I go shopping. So if I see an interesting game, I can't look up reviews of it until I get home.

Oh, I'm not buying an iPhone until I see what the next version or three look like. Specifically, I want more storage capacity and SD media support. 7.2 GB just doesn't seem like nearly enough.

Friday, June 22, 2007

Windows Vista: the most secure Operating System?



Disclaimer: I use Vista for testing and casual web browsing and Mac OS X for web and software development. I use both nearly every day. I've had no security issues with either. That said, Vista's "allow or deny" behavior is probably about as annoying as spam or popups.


Various sites (e.g. ZD-net and Engadget) are essentially regurgitating some Microsoft press release (complete with graphs, it appears) on a Microsoft-funded "research" project which shows Vista to be the most secure OS ever released (with XP coming in second -- which kind of screams credibility right there).

Secunia is my favorite security site for two reasons. First, even though like most security companies it has a vested interest in promoting Microsoft (since almost every Microsoft user pays for some form of virus protection and almost no-one else does) it seems to be relatively impartial. Scandinavian sensibilities, perhaps. Second, it gives you pretty nice graphs.

Apparently, according to Secunia, Mac OS X (versions 10.0 Public Beta through 10.4, client and server) is one product while, say, Windows Vista is one product (and, more interestingly, Windows XP Professional is one product). This means that when you look for security problem statistics, Windows Vista is in its own separate category, while Tiger is lumped in with the 10 or so other versions of OS X. Secunia also tends to downplay the severity of Windows issues and overstate the severity of Mac OS X issues (yes, if you download a malicious script file, run it, and type in your admin password when asked, it can take over your system) -- but we'll let all that slide (especially since I've ranted about it in the past).

Here's the story in pictures:









The gory details are here.

And here is all of Mac OS X since 2003 or so for comparison:









The gory details are here.

And finally, to give you a good laugh, here's what this "research" claimed was the second most secure OS:









The gory details are here.

This data is live -- so it may change after I finish this post. But right now as I look at it, Mac OS X has a better record historically, and fewer issues since Vista's release than Vista. And XP -- which according to this same "research" comes in second to Vista and ahead of OS X -- has a track record based on these statistics (and much personal experience) which is simply embarrassing.

Tuesday, June 12, 2007

Safari for Windows, Mac, and probably iPhone found to have tons of security holes



As noted here and many other places, Safari turns out to be full of security flaws at least some of which are in the production (2.0.4) version as well as the 3.0 "beta" (it doesn't show beta in its About box).

Safari on Windows is proving pretty buggy for me: among other things, it doesn't save preference changes. (Ironically, it crashes when I try to view a MacWorld Blog page complaining about the uninspiring announcements at WWDC.) Personally, I think it's nice to see security flaws in Safari exposed because, hopefully, Apple will be forced to fix them. The nastiest exploit I've seen tricks Safari into running arbitrary command lines under Windows (via cmd.exe).

Some nice information on Leopard's under-the-hood improvements



HardMac notes that Leopard has many improvements under the hood including UNIX-03 compliance, multicore optimization of network layers, automatic TCP optimization, and security features such as sandbox options for applications and notifications of an application having been altered since it was installed.

Strange Omissions



Rumor has it that, at least initially, the iPhone will lack Flash support -- argh! (Even the Wii has Flash support.) Yes, that is disappointing, but it doesn't explain why Apple's email announcement to people, like me, who signed up for news on the iPhone's release doesn't mention web browsing.

Oops.

But the really sad omission is...



It really looks like the iPhone SDK non-announcement is a little hasty. Yes, you can develop perfectly good 3rd party apps for the iPhone using a web server, but no, Apple hasn't added some perfectly obvious bells and whistles to make this seamless:
  1. There appears to be no way, out of the box, to "bookmark" or "force cache" a page and make it launch directly from the main menu.
  2. The Safari UI doesn't appear to be hideable (yes, the address bar auto-hides after a period of non-use, but it needs to be possible to hide all trace of the browser from JavaScript or something).

Technically, these are minor omissions and easily fixed, but boy would they have made great demos (versus the glorified phone book app that was shown).

Build a simple game using DHTML. Make it 480x320. Go there with your browser. Click a button to "appify" it. Boom. Third party game app. (And then demonstrate how it automatically updates the cached version when appropriate.)

Go to Google documents and "appify" it. Boom. You have a real word processor on your iPhone.

These are low hanging fruit, and it's rather lame that Apple didn't have at least one moderately cool demo to show.
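For what it's worth, the logic for the kind of "simple game using DHTML" described above is tiny. Here's a sketch of a higher/lower guessing game in plain JavaScript -- the names (`makeGame`, `guess`) are my own invention, and the 480x320 layout and button wiring are left out:

```javascript
// Minimal state for a higher/lower guessing game. In a real DHTML
// page, guess() would be called from button onclick handlers and the
// returned message written into a <div> -- that's the whole "app".
function makeGame(secret) {
  var tries = 0;
  return {
    guess: function (n) {
      tries += 1;
      if (n < secret) return "higher";
      if (n > secret) return "lower";
      return "correct in " + tries + " tries";
    }
  };
}
```

Wrap that in a page sized for the iPhone's screen, and "appifying" it would be nothing more than bookmarking and caching the page.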

Monday, June 11, 2007

Uh oh, AAPL is down $4



In general, when Wall Street responds poorly to Apple announcements it's a sign either that Apple's announcements were lame OR that Wall Street doesn't understand the implications. Remember that the iPod announcement was received with yawns (including from me) and so was AppleTV. We'll see.

Going back to my reaction to the announcements at WWDC 2006 (last year), it still seems to me that Time Machine is a killer feature. Just ship it and have it not suck and I'm sold. Stacks is also a killer feature. At last, your desktop can actually look pleasant without constant maintenance. (It's sad how much time I waste clearing up my desktops on both Mac OS X and Windows.) I should point out that the Apple Menu and tabbed Finder windows of OS 9 have needed a successor for years, and Stacks appears to be a very well thought out one.

Quick Look may or may not turn out to be amazing. It really depends on what documents are supported and how easy it is for third parties to build their own lightweight plugins (e.g. if I can preview 3d models from, say, Cheetah 3D via Quick Look, that would be great, but how likely is that?) Quick Look is eminently hackable though -- write a Quick Look plugin to do screen casting, for example (since it's unclear whether that functionality is available in iChat AV as implied).

The DVD player functionality looks like a really compelling feature, especially for the Mac Mini as home entertainment center. At last, one of the two most annoying things about DVDs (skimming through them to find something) appears to have been clobbered. Now all we need is a MENU button that can bypass ads.

Spaces looks like it will be amazing. I've got a license to Virtual Desktop somewhere (one of several free and shareware virtual screen apps for OS X) and I gave up using it long ago. For something like virtual desktop software, incredible attention to detail (like perfect Exposé integration and muting games in hidden screens) is essential, and this is where Apple can take a great concept that has never quite worked and make it available to everybody.

Again, the devil is in the detail. Yes, Vista has automated backups. So does the Mac. Do you think that this is the same as Time Machine? It's like dismissing Apple's addition of outline font support at the OS level back in System 7 by saying "hey, Windows has fonts too".

"Web Apps Are Not Applications" Rogue Amoeba



Some developers aren't terribly pleased by Apple's announced option for those wishing to develop iPhone applications.

The original post is simply sarcastic, but this response (strongly agreeing with the original post's sentiment) sums up the poster's point of view:

Apparently if we want to develop for the iPhone, we have to be web developers, and develop web apps. Saying we can develop "Web 2.0 apps using AJAX" is just a nice way of saying "No 3rd party apps and no 3rd party widgets."


Just like if you really want to develop Cocoa apps, you can't write them (easily) in Logo, Visual Basic, C#, or Pascal.

They're right, of course: Web Apps aren't Applications.

  • They don't need to be installed
  • Or kept up to date
  • Or moved from machine to machine when you suddenly need to go on a road trip
  • Or uninstalled when not needed
  • They don't support multiple users the way desktop apps do -- i.e. either (a) not at all or (b) as an afterthought
  • They can't crash the machine they're running on, only the browser
  • A rogue web app can't format your hard disk, or turn your iPhone or computer into a bot
  • They can be written using a huge variety of tools and languages, many of which are childishly simple to learn
  • "Hello, world" is only a few bytes longer than the ASCII string. There's no 20MB .NET runtime.
  • They don't need to be recompiled to run on different platforms, although they do need a little tweaking.


To allow third party development for the iPhone Apple needs to provide a development and runtime environment that:
  1. is safely sandboxed so that third-party apps can't compromise the iPhone's stability,
  2. has the power to communicate with central servers,
  3. has some kind of mechanism for distributing and updating itself, and
  4. has all the usual capabilities of handling user interaction, drawing pictures, and so on.

Safari has all of these things. It runs HTML, CSS, and JavaScript (plus, on the desktop, Flash), which can be generated by server code written in any language you like, including C++, LISP, COBOL, and Eiffel. Go for it.
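To underline the point that the server side can be anything: here's a hypothetical page-generating function, shown in JavaScript purely for illustration. The function name and the 480-pixel viewport width are my assumptions (Apple does document a viewport meta tag for iPhone Safari); the same string could just as easily be emitted by C++, LISP, or COBOL:

```javascript
// A server-side "iPhone app" is just a string of HTML. Any language
// that can concatenate strings qualifies as an iPhone SDK by this
// standard. Names here are illustrative, not an Apple API.
function renderApp(title, body) {
  return [
    "<html><head>",
    "<title>" + title + "</title>",
    // Hint the page width for the iPhone's landscape screen
    '<meta name="viewport" content="width=480">',
    "</head><body>",
    body,
    "</body></html>"
  ].join("\n");
}
```

Point a CGI script, a servlet, or anything else at `renderApp("My App", ...)` and Safari on the phone neither knows nor cares what generated it.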

Now, Apple could build this from scratch or it could use something that already exists. Since Apple doesn't have, say, a managed code environment like .NET to throw at the problem, the other glaringly obvious option is Web 2.0 etc. Which is what they picked. Sure, this limits what you can do in your application ... I don't think anyone has written a 3d animation package in JavaScript yet, so I guess that's going to be a stretch.

Don't want to sully your hands with Perl -- fine. Code your server in LISP or C++. I don't care. Neither does the iPhone.

Now there are legitimate concerns vis-a-vis the iPhone working when disconnected from the internet, or in low bandwidth situations. (a) Will it be possible to load a "website" onto your phone and run it as a local app (possibly with some local runtime support, such as Apache/PHP/Perl or whatever)? (b) Can you load a page or pages into your cache explicitly and always have access to them? These are perfectly legitimate questions for which I suspect there are good answers.

But whining about being forced to learn HTML/CSS/JavaScript or whatever is just dumb. If you can handle Objective-C, you're not going to have any problems building web applications.