RubyMotion and Accessibility

I just gave a great talk at the Philly CocoaHeads group. We met at the life-transforming Indy Hall. I covered basic iOS accessibility, RubyMotion, and how the two go together. I've seriously considered giving a similar talk at the RubyMotion conference in Belgium. We'll see how that goes.

It all started last night at my first ever night owl session. Indy Hall has them every Wednesday night. They don't count against your days; any member can go. They make a great social outlet and a chance to get a little work done. I did both.

I had started thinking about giving this talk, but didn't know if I could actually do it, or if I should fly across the world to try. I met Mike on my first day there, and that got me thinking about Philly CocoaHeads. I wondered if I could persuade him to give me a few minutes at the end of his meeting to try it out. I messaged him on Twitter and he said he'd attend the night owl session. I got there and he found me.

“So what’s the agenda like for the meeting?” I asked, trying to work in my idea. He sort of laughed. “Well it’s pretty loose. Our speaker bailed at the last minute.” Perfect! I told him of my idea and he agreed. The pieces slid together like a puzzle.

I began preparing an outline then and there. I felt so excited. I took a break and wandered upstairs, where I found a girl named Kara cooking tomato soup and polenta. This place just keeps getting better! I ate and talked, working off my energy. I began wishing I could telekinetically teleport my beer from downstairs, then a girl brought it. That made me think about language as a lower form of telepathy. I finished the glass and wondered about more. Then they introduced me to the hard cider… Appleeeeeeeeeeeeeeeeeeeeeeeee?!

I woke up the next day and finished my outline. I also got tips from some blind developer friends, as well as from the awesome RubyMotion mailing list. Things really came together. Then I got an awful cab ride there. The interior smelled like an armpit and I wanted to vomit. I found myself at Indy Hall and had a quick smoke and a word with Adam. He recommended a service called Uber. I told him I would have to register and try it for the journey home.

I went up to the second floor and found Mike and everyone else just hanging out. I had already eaten, so I declined the pizza, though I would feel hungry later. I drank water and went over things one last time. I tried to stay calm and mostly succeeded. The meeting began.

They went over some business issues and had a cool show and tell period. One guy showed off some custom layouts he made for images. He encouraged people to publish them on GitHub, which fit perfectly into my talk, since a number of developers have released open source RubyMotion apps. A high school kid showed a sweet jailbreak tweak he made that gives the iPhone a universal QR code scanner. I wanted to bring up the accessibility implications, but didn't get a chance. Very cool! And great to see someone of that age doing this stuff. Keep it up, man, just stay out of trouble.

The time for my talk had come. Mike offered a patch cord and suggested an audio demonstration. I hadn't thought of that, but liked the idea. I tend to gloss over all this basic stuff, but forget that most people have never encountered it before. I got everything set up and began. Listen for yourself.

First I covered VoiceOver basics, explaining just how a blind person uses an iPhone. The iPhone changed my universe as soon as it entered it. I wrote an article about it which went viral. So many amazing apps make such a difference. I mentioned Color ID, LookTel Money Reader, and BlindSquare as examples, though hundreds more exist. Someone asked a great question about how one would begin: if a blind person just walked into an Apple Store, what kind of help would they get? I told them about my experiences, which I recounted in my very funny article entitled Rejoining the Apple Family. Someone asked if BlindSquare provides the kind of contextual information about your surroundings that a sighted person would have; they got it exactly. It also helps you not get ripped off by cabbies.

Next we turned to the exciting world of RubyMotion, which lets you write iOS apps in Ruby. I apologized in advance for pushing Ruby, even though I love the language. I also plugged TinkerLearn, a great course for learning iOS and Objective-C. I recounted my initial experience hearing about RubyMotion, like a vision crystallized into physical reality. I covered some things that make Ruby great, and RubyMotion especially. I mentioned some common gems, including BubbleWrap, sugarcube, Formotion, geomotion, and MotionModel. I also discussed creating views. Rubyists prefer to do it programmatically, which also works better for a blind developer. To that end, the Teacup gem provides a domain-specific language for designing stylesheets.
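To make that concrete, here's a minimal sketch of building a view programmatically in RubyMotion. The controller and label text are hypothetical examples of mine, not from the talk; it's just plain UIKit calls written in Ruby.

```ruby
# A minimal sketch: building a view in code rather than Interface Builder.
# MainController and the label text are hypothetical placeholders.
class MainController < UIViewController
  def viewDidLoad
    super
    view.backgroundColor = UIColor.whiteColor

    label = UILabel.alloc.initWithFrame(CGRectMake(20, 60, 280, 40))
    label.text = 'Hello from RubyMotion'
    view.addSubview(label)
  end
end
```

Teacup wraps this same pattern in a stylesheet DSL, so the frames and styles live in one place instead of scattered through the controller.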

I then brought the two topics together, and discussed how RubyMotion helps accessibility. I started by restating my opinion that I just seem to get more work done in Ruby. It also sounds better with speech to me. I also know plenty of blind C programmers, so don’t take that as a generalization.

Now I got to one of the big points of the discussion. RubyMotion has unit testing, something quite cool in itself. This lets a developer write automated tests for various parts of their app. Not only that, RubyMotion also provides functional tests. These let you simulate the effect of tapping a button in a view. And how does it know the name of the button? From its accessibility label! In other words, if you write specs like a good developer, you will have to label your buttons properly, which addresses the main accessibility complaint.
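Here's a hypothetical functional spec to show what I mean. The `tap` helper looks the button up by its accessibility label, so the spec can't pass unless the button has one. The controller and button names are my own placeholders.

```ruby
# spec/main_spec.rb — a hypothetical functional spec.
describe 'MainController' do
  tests MainController  # spins the controller up for functional testing

  it 'starts playback when the Play button is tapped' do
    # tap finds the button by its accessibility label, so an
    # unlabeled button means this spec can never pass.
    tap 'Play'
    view('Now Playing').should.not == nil
  end
end
```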

After making more fun of Xcode, I covered a challenge for blind developers. The iOS Simulator doesn't work well with VoiceOver, no matter which language you use; either way, testing an app on an actual device works much better. This means you can't use the super cool REPL which RubyMotion provides, since it only runs against the simulator. Instead you have to use the debugger. Based on GDB, it works well enough, but the developers intend to create a friendlier, higher-level debugger in the future.

I finished my main talk and opened it up for questions. Someone asked how I used Xcode when I tried it. "Oh, I didn't try very much," I responded matter-of-factly. That got a laugh.

I explained how to interact with items using VoiceOver on a Mac. Xcode has a very complicated layout. I had no idea. Mike asked about build settings. With RubyMotion you just add them in your Rakefile, a plain text file you can edit with anything. Nice and easy. He also asked about constraints. Some of the gems I mentioned have begun supporting them, but I don't know much about them yet. I understand more now though, thanks to the wonderful CocoaHeads.
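For example, a RubyMotion Rakefile from that era looked roughly like this; the app name, identifier, and framework are placeholders of mine:

```ruby
# Rakefile — a minimal sketch of a RubyMotion project file.
$:.unshift('/Library/RubyMotion/lib')
require 'motion/project'

Motion::Project::App.setup do |app|
  app.name = 'MyApp'                    # placeholder name
  app.identifier = 'com.example.myapp'  # placeholder bundle identifier
  app.frameworks += ['CoreLocation']    # adding a framework is one line of text
end
```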

Someone asked another good question: whether I see differences in accessibility across Apple devices, an iPad versus an iPhone, for example. I said they behave very similarly, though iPad screens do have more complicated layouts with more going on. The iPhone gives a more streamlined experience. I also forgot to mention Zoom, Apple's large print magnification software. Someone asked about Siri. I said it works as well for me as it does for anyone, except for me it seems to have happened in reverse. I've always enjoyed having my text messages read to me. Now sighted people have finally discovered it.

Someone asked about any egregiously bad apps, in other words really inaccessible ones. My mind went blank. We always have one or two apps in mind, and they change over time. For lack of a better answer I named the one I had most recently thought of: The Lost Treasures of Infocom. I also recommended AppleVis, which has accessibility ratings.

Mike asked about web sites and I gave my standard answer: skip to content links, good use of headings, and a minimum of Flash and JavaScript weirdness. Mike corroborated my statement that if you use standard Cocoa controls and label them properly, you've done 90% of the work. I mentioned how the designers of the Color ID app didn't even intend it for blind people, and felt surprised at that use for their app. Someone asked how I perceived the color. I explained that I can see blurs of color. I can also see by using echolocation, but we'll save that for another talk. Someone asked about native apps versus responsive apps with HTML views. I didn't know for sure, but felt pretty confident saying that native works better. And that did it for my talk.
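To put Mike's point in code: labeling a stock control takes exactly one extra line. The image-only button here is a hypothetical example of mine.

```ruby
# A hypothetical image-only button. Without a label, VoiceOver falls
# back to something unhelpful like the image's file name.
button = UIButton.buttonWithType(UIButtonTypeCustom)
button.setImage(UIImage.imageNamed('play'), forState: UIControlStateNormal)
button.accessibilityLabel = 'Play'  # the one line that does most of the work
```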

My brain felt shot. Beer and tobacco sounded good. We made our way to National Mechanics. I had a bad opinion of the place because we had gone there after BarCamp and it got so loud. This time it had a more tolerable volume. I had a double cream stout, then felt hungry so ordered a veggie burger. I also wanted a beer from Belgium to get that vibe going, so I got a Corsendonk Christmas Ale. You Belgians make some good beer!

I felt like a nice ride home, and I had gotten some credit for the service, so I tried Uber. Within minutes a nice black car pulled up and brought me home. I had a very pleasant experience. I'll take fake vanilla over real armpit any time.

As I sat down to write this article I got another delightful surprise. A guy named Drew mentioned me on Twitter. "Worlds collide," he said. He referenced Alex, one of the organizers of Indy Hall, so I thought it had something to do with my talk. I asked if we knew each other. He said he didn't attend my talk, but he remembered playing Barneysplat! I wrote that game in the nineties and it became insanely popular. It always warms my heart when someone else remembers it. It made my night.