Welcome to my home page. I became blind at birth. I started programming computers at a young age. I also earned my general class amateur radio license, KA3TTT, a hobby to which I have returned with great joy. I practice Qigong and consider myself a Taoist. I use Linux as my desktop and Android as my mobile OS. I eat gluten-free vegan meals. For the rest you'll have to read my blog. To comment on what you read here, visit Disboardia, my bulletin board system.
I just took part in an email interview with the Apple Fancast for their segment the Rounded Rectangle. They asked me a series of questions. They did the same with another accessibility expert named Steven Aquino, a low vision user. I had a lot of fun and got out some good information. Enjoy.
I finally got around to writing my review of the Apple Watch and its accessibility features. When I first got it, I deliberately held off, figuring that anyone could write an article saying that the Apple Watch rules. I wanted to write something more thoughtful. After two months wearing the watch, my opinion has not changed, and I should have just written it then. Apple has described it as their most personal device and I agree. They have created the first accessible wearable and I love it.
When Apple first announced that they would come out with a watch, the blind community immediately became interested. Would it have accessibility like all of their other products? Some said yes. A few said no. Others said maybe not at first, but eventually. In October of last year Apple released WatchKit as part of iOS 8.2. I wrote an article showing that WatchKit contains methods to set accessibility attributes. In March of this year I gave a talk at Philly Cocoa summing up what we knew then. Apple still had not released any details about VoiceOver, and the blind still made an anxious noise. Fortunately, and not surprisingly, when the watch did come out it had a full version of VoiceOver much like that on the iPhone.
On April 13, the Monday after the Apple Watch became available to preorder, I went to the Walnut Street Apple Store, a place with which I have become familiar. They set up an individual session with me in the briefing room. Sadly, at the time they only had demo units which did not have VoiceOver enabled. They just ran a dumb demo loop over and over. The Apple employees tried everything they could think of to give me access, including opening the precious safe to check the Apple Watch Edition. I felt a little bummed that I couldn’t try VoiceOver, but the visit did give me a chance to try them on, so at least I could order one. I kept coming back to the 38 mm Apple Watch Collection with the Milanese Loop. The metal mesh has an intricate feel, and it makes clever use of a magnet. I ordered it as soon as I got home. My friend at the Apple Store said that Apple wanted to under-promise and over-deliver.
Even though I ordered the watch I still wanted to try it, so I emailed a friend in Apple accessibility. She told me that the Apple Store would have fully functional units for accessibility testing in two weeks. Sure enough they did, so two weeks later I returned. VoiceOver worked exactly as I had read and thought, borrowing the one and two finger gestures from the iPhone and adding in a few more. Now I really wanted mine to arrive.
Another customer had a demo in the briefing room at the same time, and I couldn’t help but overhear that they would get the Apple Watch Edition out of the precious safe. I asked if I could try it on and they said yes, even though I made it clear that I did not make enough money doing accessibility work to afford the $17,000 price tag. It felt just like the stainless steel Apple Watch Collection which I ordered, except slightly heavier. Honestly, as a blind user I had to laugh; I just didn’t see the point. It has the same internals! It does come with a classy leather jewelry box with an inductive charger built into it and a beautiful tactile Apple logo on the top. Maybe one day.
On May 6 I got a notification that UPS would deliver my package. Time slowed to a crawl as I waited for something to happen. Nothing did. At some point in the early evening I checked one last time. UPS claimed they made a delivery attempt. I felt enraged. I had waited all day. As it turns out they had a problem getting into the building, and left the delivery slip. I got my Mom to help me print out and fill out the form on Apple’s site to sign for the delivery, and we put it where they’d hopefully find it. I began having flashbacks to my iPad 2 delivery.
I had nothing to worry about. It arrived the next day as I made dinner. I didn’t care; I set it up immediately and sent my Mom a text. Everything worked out of the box. I had my Apple Watch!
At first I worried about the battery level. It reached 10% close to the end of the first day. Then I read some tips to save battery, many of which also apply to iOS: turn the screen brightness down to 0%, turn on Screen Curtain, and turn on the Reduce Motion and Grayscale settings in Accessibility. Now my battery usually ends the day at a little under 50%.
People ask if it replaces my iPhone and I say no. I use it for quick tasks such as checking a notification, or quickly replying to a text with “Ok.” or a simple dictation. Sometimes I will take a quick phone call and feel like I’ve entered the future. I often check the stocks or weather. I also use Siri a lot more on my watch than on my iPhone, since you kind of have to. It also feels cool to open the Uber app, hit the single “Request” button, and have a car pull up a few minutes later, all ordered from my watch.
I enjoy tracking my workouts. I do yoga, as well as the workouts from BlindAlive. The watch tracks my calorie count and heart rate. I also have my activity right on my watch face, and seeing “Exercise, 0%” often goads me into action. I do wish they’d allow setting the hour at which the day begins. Currently I work out and go about my day, but then at midnight it magically resets like Cinderella’s coach turning back into a pumpkin. I feel pretty certain a few Apple employees have stayed up past midnight. I recall stories of them wearing shirts that said “80 Hours a Week and Loving It” in the Steve Jobs era.
I really like the haptic feedback. In Apple Maps, you can get directions right on your watch. This works very well for the blind, since we don’t have to keep getting out our phone and fumbling around while walking. When I tried it, it did have some issues syncing with the phone, but that has nothing to do with accessibility.
Using the first edition of the Apple Watch reminds me a lot of using the first iPad. It works well enough and proves the concept. Sometimes it lags or gets confused, but overall it works. When the iPad first came out a lot of people felt unsure about its purpose. Some made crude jokes. Then when the iPad 2 came out, it had a thinner design and snappier performance. No one laughed. I think the same will happen with the Apple Watch.
Someone might wonder why you’d need a watch when you could easily take a small phone out of your pocket. For some reason having something worn really does make a difference. It does feel very personal, and this extends to accessibility. For example, when you earn a workout achievement, it actually says “A shining achievement award rotating into view.” When meeting for my demo I said that Apple has created the first accessible wearable. “You know, when you say it like that it sounds really big,” said an employee. “It IS really big!” I exclaimed. Nobody else has made an accessible wearable, but Apple has, and it works beautifully. I love my Apple Watch!
I recently celebrated Towel Day, a day to honor the life of Douglas Adams, author of the Hitchhiker’s Guide to the Galaxy. The book sells so well because it has “Don’t Panic” on the front in large friendly letters. It inspired me to make a Don’t Panic alarm for my Mac. Sorry I don’t know how to do this in Windows, perhaps someone can leave instructions in the comments.
During one part of the radio drama, Zaphod has to get out of the Hitchhiker’s building in the middle of a bombing. An alarm sounds and starts saying “Don’t panic.” I extracted this, put it in a .wav file, and installed it. It has already helped defuse a few situations.
First, download this file. Next, move it into your ~/Library/Sounds folder. Finder hides ~/Library by default, but you can press Command-Shift-G and type ~/Library/Sounds to go there. Then copy the file in as you would any other. Assuming you have it in your Downloads folder, the following command in Terminal will also do it:
mv ~/Downloads/"Don't Panic.wav" ~/Library/Sounds
Once you have moved the file, open System Preferences and select Sound. Choose the Sound Effects tab. You should then find Don’t Panic in the table of alert sounds.
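If your account has never had custom alert sounds, ~/Library/Sounds may not exist yet and the mv will fail. Here’s a slightly more defensive sketch of the same install, assuming the file sits in ~/Downloads:

```shell
# Create the per-user alert-sound folder if it's missing
mkdir -p ~/Library/Sounds

# Move the sound into place, guarding against a missing download;
# the quotes matter because the filename contains a space
if [ -f ~/Downloads/"Don't Panic.wav" ]; then
    mv ~/Downloads/"Don't Panic.wav" ~/Library/Sounds/
fi
```

Once it sits in place, you can preview it from Terminal with afplay, which ships with OS X.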
Share and enjoy!
A few weeks ago I participated in EvoHaX, an accessibility hackathon which happened as part of Philly Tech Week. Ather Sharif of EvoX Labs did a wonderful job organizing it. I had other commitments during the main coding day, so we compromised and made me a judge. I also gave a little speech poking fun at their prize of a Google Chromebook. I enjoyed the experience and feel glad they have already said they will do it next year.
I find it funny that I have helped plan two accessibility hackathons, but have not written a single line of code for either. I’ve had other accessibility-related commitments. Last year I spoke at the annual RubyMotion developer’s conference, and this year I gave a workshop at the University of the Arts as part of our new business called Philly Touch Tours, more on that soon. I met with Ather and the other planners and we went over the whole event. Ather had an interesting idea to pair groups with a random subject matter expert, in other words a user with a disability. This mirrors the real world – you never know when you will suddenly have to face a challenge.
On Friday April 19th the event began. Benjamin’s Desk hosted it. We listened to several informative speeches. One professor specialized in rendering infographics for screen readers. A cool topic for sure, but he kept asking “What do you see in this graph?” I wanted to yell out “Nothing!” In my head I heard my high school geometry teacher saying “You’re not much of a visual learner, are you?”
On Sunday I rolled in for the judging. I met the other judges and experts. I also saw my friend Faith Haeussler and her very cute kids who know the word hackathon. Everyone had finished coding. I got out my MacBook Air and prepared to begin.
Before I continue I have to explain something which it seems a lot of people don’t know. Blind people tend not to use Google products. Google has become synonymous with second-rate accessibility. iOS dominates the mobile and tablet space. None of my blind friends use Android, and I mean that literally. Zero! For the desktop we use Windows or Mac OS and their respective screen readers. I don’t know anyone who uses ChromeVox. Personally I use a Mac with VoiceOver and Safari for my browser. When designing something for the blind you must remember the platforms used by the blind.
Because of this, I couldn’t get over the prize of a Google Chromebook for each member of the winning team. It really depressed me. For a few days I lay around, lamenting that I would have to participate in an accessibility hackathon that gives away Google Chromebooks as prizes. The world will end! Then I pulled myself together and remembered that the prize doesn’t really matter, all the wonderful inspiring work does. This gave me a great idea for a speech. I composed it in my head as I waited to judge the entries.
First up, West Chester University wrote a Chrome plugin called Jumpkey to easily navigate to common places on a web page, such as the home or contact links. Interesting concept. They brought over a MacBook Pro running Chrome with ChromeVox, which I had never used. It started talking in a goofy Google voice which made me laugh. I figured out a few keys and the plugin worked. One of the authors told me he could port it to Safari in an hour. I hope he does.
Next, La Salle University demonstrated their project, a browser framework called Blind Helper. They admitted they needed to find a better name. Fortunately this one worked with Safari. They designed a system for crowdsourcing image descriptions and rendering them as alt tags. I liked the idea, and the demonstration worked. However, their logo didn’t have an alt tag, and the form fields did not have labels. It struck me as rather ironic. When coding an accessible platform you should make the platform accessible! They lost a point or three for that. Still, it has potential.
Next, an all-female team of hardware hackers from Drexel stole the show with their speech-reader Bluetooth module. They designed it for those with cognitive difficulties, but it has other uses as well. They used an Arduino with some other components. They even tested it with NVDA, the popular free screen reader for Windows. Excellent!
St. Joe’s presented a browser plugin for those with dyslexia to place icons next to familiar words. This helps their brain figure out the proper word by giving it some context. They could even make it multi-sensory. I couldn’t use it so couldn’t really comment, but I like the idea.
Finally, Swarthmore College presented a visual data representation of the YouTube videos which have captions, or rather the lack thereof. I couldn’t see the graph, but they could render it in textual ways. I also grew up in Swarthmore, so I wished them well.
To vote, EvoX Labs wrote a little web app for the judges. And yes, they made it accessible. I filled out my form and Faith read the results. After congratulating everyone we made speeches. I called mine The Accessibility MacGuffin. A MacGuffin refers to an object which drives the plot of a story. The object itself doesn’t matter, the story around it does. For example, the briefcase in the movie Pulp Fiction doesn’t really matter. We never know its contents. We only know that some gangsters have to retrieve it and protect it for their boss, using some rather extreme means to do so. This graphic scene demonstrates the power of a MacGuffin. Pay attention to the briefcase!
I didn’t know how people would feel about making fun of the prize, but it went over well. I hope the participants will think about accessibility in all their projects. I also hope they continue developing the projects started at EvoHaX. See you next year! Maybe I’ll actually get to write some code.
As I detailed at great length, I have FIOS, Verizon’s fiber optic internet service. I have never liked using stock firmware mainly for accessibility reasons, but also because we just don’t know what it may contain. For a while I used Tomato, but have decided to switch to OpenWRT. I thought I’d have an easy time porting over my network’s settings and that I could then continue on my merry way. I thought wrong.
I use AirPlay to stream music around my condo. It has worked well, but I started getting an increasing number of dropouts. I decided to upgrade my router to use 5 GHz since it has less interference. Plus, it couldn’t hurt to run newer firmware. After some research I purchased the TP-Link Archer C7 because it supports 802.11ac and has four Ethernet ports. I still like good old wired Ethernet when possible.
The device arrived quickly and I went to work setting up OpenWRT. I went to the stock firmware’s default address of 192.168.0.1. I found the upgrade button. A dialog box popped up: “Are you sure you want to upgrade?” Yes! Definitely!
I installed OpenWRT without incident. I set up my password and some basic settings. I enjoyed using the command line to do everything. Then I got to the part where I set up my internet connection and hit a dead end. As it turns out, Verizon FIOS has a weird issue with their DHCP lease, and some other nasty surprises. I reproduce the following instructions in hopes they will help another. I also hope they integrate this fix into the next release. This works with Barrier Breaker 14.07.
Firstly, and most importantly, you must have the MAC address of a currently working router. Copy this down before continuing.
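Since the whole setup hinges on that MAC address, it helps to sanity-check what you copied before you need it. Here’s a quick illustrative shell check; the address shown is a placeholder, not a real one:

```shell
# A MAC address is six colon-separated pairs of hex digits
mac="aa:bb:cc:11:22:33"   # substitute the address you copied down

if printf '%s\n' "$mac" | grep -Eq '^([0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}$'; then
    echo "MAC format looks valid"
else
    echo "MAC format looks wrong"
fi
```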
Download the version of OpenWRT appropriate to your router. In addition, download the patch utility, which you will find in the packages subdirectory. It will have a filename like patch_2.7.1-1_r71xx.ipk. You will also need the patch from this forum thread. Remember to get the latest patch; the thread has several. As of this writing it has the filename dhcp.sh-150411.patch.
So just to recap, at this point you should have a working MAC address, the version of OpenWRT appropriate for your router, the patch utility, and the patch from the forum thread. Install OpenWRT as detailed in the wiki and stop when you get to setting up your internet.
Now use scp to copy the patch utility and the patch itself to your /tmp directory.
$ scp patch*.ipk dhcp.sh*.patch root@192.168.1.1:/tmp
Now ssh back into your router, install the patch utility, and apply the patch:
# opkg install /tmp/patch_*.ipk
# cd /lib/netifd/proto
# patch -p0 -b < /tmp/dhcp.sh-150411.patch
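The -b flag told patch to keep a pristine backup with a .orig suffix, so rolling back is just a copy. A generic sketch, assuming (as the patch name suggests) that the file touched was dhcp.sh:

```shell
# patch -b saved the untouched file as dhcp.sh.orig next to the patched one;
# copy it back over the patched file to undo the change
if [ -f dhcp.sh.orig ]; then
    cp dhcp.sh.orig dhcp.sh
fi
```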
The patch should succeed. If it fails, restore the backup file. Assuming that worked, edit your /etc/config/network and configure your WAN:
config interface 'wan'
	option ifname 'eth0'
	option proto 'dhcp'
	option macaddr 'MM:AA:CC:AA:DD:RR' # replace with the MAC address you copied earlier
	option clientid 'noc'
	# Use alternate DNS servers so Verizon won't spy on you
	list dns 18.104.22.168
	list dns 22.214.171.124
Save and run:
# /etc/init.d/network restart
and hopefully everything will work. I went through so much to get this working that I had to write it down. Requiring a working MAC address really threw me for a loop. I even used the one from my previous third party router, and it still worked. So there you go, enjoy blazing fast FIOS on your awesome new router.