Monday, June 04, 2007

OMG iPhone SDK!!!

Well, the Mac news and rumor sites are abuzz with news of a potential iPhone SDK at WWDC '07. Personally, I never know what to trust when these types of rumors come out, though I do believe there is usually an element of truth behind them. Since writing my original iPhone SDK analysis shortly after Steve's keynote, I've had some time to reflect on it, and I must say I still stand by it. To quickly recap, here are some observations I made:
  • There are no windows on the iPhone; the entire system appears to revolve around presenting and animating views
  • The sheets that animate onto existing views appear to come only from the bottom up, as seen in the keynote (see the sketch right after this list)
  • All views and layers appear to be able to become partially transparent, including the toolbars and the topmost status bar
  • The iPhone shows the same Cocoa controls we know, some sporting theming to blend in with the iPhone UI; these include NSButton, NSSegmentedControl, NSSearchField, NSToolbar, etc.
  • The keyboard sheet will probably be a standard sheet you can call up for input from any app; gone will be the days when we could simply assume users can type something at any time
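Just to make the view-and-layer idea concrete, here's a quick sketch using nothing but the Leopard desktop Core Animation API (CALayer and CABasicAnimation, which really do exist on 10.5). Everything else, the function name, the 0.3 second duration, the 0.85 opacity, is my own guess at what a bottom-up, partially transparent sheet might look like; none of it is a real iPhone API.

```objc
#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>

// Speculative illustration only: slide a "sheet" layer up from below its
// parent and fade it in to partial opacity, roughly the way the keynote
// sheets appeared. This is plain Leopard Core Animation; nothing here is
// a real iPhone API.
static void SlideInSheetLayer(CALayer *sheet, CALayer *parent)
{
    CGPoint start = CGPointMake(CGRectGetMidX(parent.bounds),
                                -CGRectGetMidY(sheet.bounds));  // just below the edge
    CGPoint end   = CGPointMake(CGRectGetMidX(parent.bounds),
                                CGRectGetMidY(sheet.bounds));   // flush with the bottom

    // Set the final model values up front, then attach explicit animations
    // that replay the transition from the starting values.
    sheet.position = end;
    sheet.opacity  = 0.85f;   // partially transparent, like the status bar
    [parent addSublayer:sheet];

    CABasicAnimation *slide = [CABasicAnimation animationWithKeyPath:@"position"];
    slide.fromValue = [NSValue valueWithPoint:NSPointFromCGPoint(start)];
    slide.duration  = 0.3;

    CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
    fade.fromValue = [NSNumber numberWithFloat:0.0f];
    fade.duration  = 0.3;

    [sheet addAnimation:slide forKey:@"slideIn"];
    [sheet addAnimation:fade  forKey:@"fadeIn"];
}
```

If the iPhone really is built on this stuff, presenting one of those sheets could plausibly end up being not much more than this.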
If the iPhone SDK does actually show up at WWDC, then many questions developers have been curious about will finally be answered. Apple likes to keep their apps short and simple, especially on the iPhone, so it's difficult to try to predict the full range of such an SDK. John Gruber thought that the iPhone SDK would simply take time to document before it could be released to developers, and thus the comments made by Steve Jobs are simply irrelevant and hide the fact that they may still be preparing the SDK for public consumption, a thought seconded by boredzo. Which makes sense to me; after all, the reporters who managed to get some play time with the iPhone after the keynote said that at some points they thought they had broken it because no controls responded, only to find out that what they were looking at was just an image instead of a finished app. Even if the software were finished, it would still take some time to document to any decent degree.

Something I'm surprised has never been mentioned anywhere is a good theory for how to develop software for the iPhone. After all, gone will be the days of a single toolbar, one mouse, a floating palette, etc. This isn't Windows Mobile, this is Mac OS X Mobile, dammit! Where are the big discussions? Even in the absence of an official SDK, we could all make some general assumptions and guidelines from the views and buttons we saw in the keynote alone. Maybe a better question is: if the Windows Mobile-based phones suck so much, what can we learn NOT to do from them? Unfortunately this is something I feel I can't adequately answer here, as I've never used a smartphone running Windows Mobile; the best phone I've owned so far is my current Motorola RAZR V3m that's been locked down by Verizon. Thankfully it syncs with iSync, or I should say it did; it now hits bugs while syncing, and a stupid calendar issue of some sort forces me to sync every couple of weeks instead of once a month. In any case, every phone I've ever used has utterly sucked in my opinion. Entering alarm times and looking up events in my calendar both utterly suck, and the fact that Verizon has locked the phone down doesn't help at all.

Another thing people haven't explicitly discussed is the impact of some particular iPhone features on the SDK. If the iPhone has Core Animation, then it should generally be assumed that it's based off of Mac OS X 10.5 Leopard, unless Apple did a major job of back-porting Core Animation to 10.4 Tiger, which I very much doubt. That also means we have garbage collection, which makes a lot of sense for the iPhone as it won't exactly be an expandable environment.

Some developers have been wondering how we handle the "multiple mouses" thing. Personally I'm not worried about simple operations such as zooming views; I think that may be handled by simply having your view do drawing triggered by delegate methods. What I'm wondering about is things like how we track changes to the UI made by two "mouses": say we have two sliders and the user decides to operate on both of them at the same time, or can I use one "mouse" to let a user browse a list and another one to make that list scroll faster or more slowly? (I've put a purely speculative sketch of one way the two-slider case could work at the end of this post.)

Comments on the previous iPhone article raised several possibilities for the developer tools, anything from a whole new Xcode to just a simple iPhone emulation app. Personally, I'm going with the second option.
I think it wouldn't be in Apple's interest to create a whole new Xcode for the iPhone, but rather to create an iPhone project category and possibly limit what you can do in Interface Builder under this special mode. Lastly, one BIG thing nobody's mentioned is that this SDK may be very limited at first. If the SDK is presumed to rely on Leopard features, then running it should be limited to Leopard as well, under a...oh, say, new beta of Leopard distributed at WWDC. Overall I'll say this: I don't think it's outside the range of possibility to see an iPhone SDK introduced at WWDC, or at the very least to see one announced and shipped later on after WWDC. I have the day off work for Steve Jobs' keynote, and unfortunately I am not going to WWDC, so I will have to watch the feeds the same as many of you.
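Since nobody outside Apple has seen the real API, here's the rough, purely speculative sketch I promised of how two fingers could drive two sliders at once. The SpeculativeTouch class, the speculativeTouchesBegan:/speculativeTouchesMoved: callbacks, and the per-finger integer identifier are all invented for illustration; only NSView, NSSlider, and the NSGeometry calls are real Leopard Cocoa.

```objc
#import <Cocoa/Cocoa.h>

// Invented stand-in for whatever per-finger object a real SDK might deliver.
// The stable identifier is the key idea: it lets a view remember which
// finger "owns" which control for the duration of a gesture.
@interface SpeculativeTouch : NSObject {
    NSUInteger identifier;   // hypothetical per-finger ID
    NSPoint    location;     // hypothetical position in view coordinates
}
@property (readonly) NSUInteger identifier;
@property (readonly) NSPoint location;
@end

@implementation SpeculativeTouch
@synthesize identifier, location;
@end

// A view with two sliders that two fingers could drag at the same time.
@interface TwoSliderView : NSView {
    NSSlider   *sliders[2];
    NSUInteger  owners[2];   // which touch identifier grabbed each slider
}
- (void)speculativeTouchesBegan:(NSSet *)touches;  // imagined callbacks the
- (void)speculativeTouchesMoved:(NSSet *)touches;  // system might send us
@end

@implementation TwoSliderView

- (void)speculativeTouchesBegan:(NSSet *)touches
{
    for (SpeculativeTouch *t in touches) {
        for (int i = 0; i < 2; i++) {
            if (NSPointInRect([t location], [sliders[i] frame]))
                owners[i] = [t identifier];   // this finger now owns slider i
        }
    }
}

- (void)speculativeTouchesMoved:(NSSet *)touches
{
    for (SpeculativeTouch *t in touches) {
        for (int i = 0; i < 2; i++) {
            if (owners[i] != [t identifier])
                continue;
            // Map the finger's horizontal position onto the slider's range,
            // so each slider follows only the finger that grabbed it.
            double fraction = ([t location].x - NSMinX([sliders[i] frame]))
                              / NSWidth([sliders[i] frame]);
            [sliders[i] setDoubleValue:[sliders[i] minValue] +
                fraction * ([sliders[i] maxValue] - [sliders[i] minValue])];
        }
    }
}

@end
```

Whether Apple exposes anything this low-level, or just sends high-level "pinch" and "zoom" messages, is anyone's guess.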

4 comments:

  1. Anonymous (3:31 PM)

    Very interesting articles.

    I'm a Windows Mobile (WM) programmer, not by choice. I am also a decades-long student of GUIs (and have written some).

    You are the first Apple person I've noticed who has even tried to study the iPhone GUI. Most people got caught up in the multi-touch hyperbole and failed to notice that it was an extremely minor part of it.

    I noticed right away that starting an app zooms it into place. Thereafter, _usually_ common dialogs come up from the bottom (or top). Choices almost always cause a slide-in of the next screen from the right. Also, most screens have Back buttons to go up one level, and often (with a tap) will show even more buttons for actions.

    The main diff between WM and iPhone is that WM is geared towards the user downloading lots of 3rd party apps. It maintains the Windows Start Menu paradigm, which is not beautiful. But most of all, its Start Screen is geared towards PDA users, showing your appointments, calls, etc. The iPhone's main screen is a menu instead.

    The submenus that show up with a tap on the iPhone, but which then take up space, show up as temporary context menus with a long tap on WM. (The equiv of a right-click.)

    I suspect iPhone apps won't really get dual mice inputs, but rather a "pinch/zoom" message. But I obviously could be wrong!

    I see both GUIs evolving towards each other. Gotta run for now.. dinner.

    Cheers, Kev

  2. Anonymous (6:29 AM)

    Btw, anyone thought about how they intend to develop and test iPhone apps without a Cingular account?

    Simulator? Dummy non-phone device?

    Also, you should really get a touchscreen monitor for development.

    Kevin

  3. I wasn't quite seconding his thought; more just making a point about Apple's spotty track record of documenting APIs. My opinion on the iPhone SDK issue is that AT&T doesn't want them to make one because it threatens their ability to sell airtime and SMS.

  4. Anonymous (7:40 AM)

    I would think that "multiple mouse events" would simply be handled as separate events by the event manager, so the programmer wouldn't have to do anything special to let a user interact with multiple controls. Apple should also include an integer with each touch event, as well as a method to find out how many touch events are active at a given time. That way the programmer could write their own custom behaviors; although the zoom feature is likely already provided by the frameworks, it would be nice to invent new ways of interacting with applications.
