Welcome to my home page. I became blind at birth. I started programming computers at a young age. I also earned my general class amateur radio license, KA3TTT, a hobby to which I have returned with great joy. I practice Qigong and consider myself a Taoist. I use Linux as my desktop and Android as my mobile OS. I eat gluten-free vegan meals. For the rest you'll have to read my blog. To comment on what you read here, visit Disboardia, my bulletin board system.
Ever since I held an iPad 2, I have pictured it as some kind of supermodel. It just has such a sleek and beautiful feel to it. The proportions feel just right. The curves feel perfect. Apple made a beautiful machine.
Unfortunately, the various covers just did not seem adequate. Apple’s solution, the Smart Cover, offers minimal protection. One article compared it to putting a supermodel in a wetsuit. In a way I appreciate its minimalism, and could imagine some of what a sighted guy must feel when looking at a similarly attired woman. Still I knew I had to find something more formidable.
I wanted something classy. I wanted a nice keyboard. I wanted something that wouldn’t add too much weight or bulk. I wanted something that would complement the elegant contours of the iPad’s beautiful body. Even though I knew what I wanted, I didn’t know what to actually buy.
First I tried the New Trent Keyboard Case. Friends love their New Trent battery packs, so I thought I’d try it, plus it got good reviews on Amazon. Unfortunately it just did not deliver. The keyboard and cover snap together clumsily, and the whole thing gives the iPad a bulky feel. Worst of all, it has an extra Delete key which makes VoiceOver freak out. It acts as though the user continuously holds it down in any input field, instantly deleting any entered text and making a bonking sound. Not good.
Even though I hadn’t found my answer, I had realized that a keyboard definitely complements a tablet. I went back to using Apple’s Bluetooth keyboard. In a turn of events, at the 2012 WWDC, Apple announced a new iPad case with cover. It promised complete protection for the front and back. I thought that, combined with the keyboard, it might provide a solution. I thought wrong.
Where Apple’s Smart Cover seems like putting a supermodel in a wetsuit, their iPad case seems like putting one in a baggy swimming suit. They had to make it fit the iPad 2 and the iPad 3, so on my iPad 2 that extra room feels noticeable. I felt a little disappointed about this. Apple loves minimalism, and should have seen this. I don’t have much else to say, the rubber covers the back and the cover flips over the front. After some playing I returned to just using the Smart Cover. Why not?
Then I heard about the Logitech Keyboard Cover. It sounded exactly like what I wanted. It features a full keyboard, and a metal back which complements the iPad. The cover snaps onto the iPad with the same magnetic hinge used by a Smart Cover. After reading some good reviews I decided to give it a try. I love it!
The keyboard cover seems like the perfect complement to the iPad’s beauty. When closed, the two pieces feel like one unit. An unaware person would probably not even know what they held. It turns it into a whole new piece of technology, and changes the way I use the iPad. I don’t even know what you’d call it. Some call it a netbook, but that doesn’t seem quite right, more like a netbook with a giant touch screen. Welcome to the future.
It has totally changed the way I use my iPad. Now I can just open it and browse the web, or read Twitter, or read mail, secure in the knowledge I can easily type something. Apple has a very strong vision of a tablet without a tactile keyboard. I can understand it, but I don’t fully agree. For me, a tactile keyboard complements a tablet beautifully. Unfortunately Apple will probably never make a keyboard cover, so we will have to find third party solutions. The Logitech Keyboard Cover makes a beautiful thing even more beautiful.
In January I wrote an article detailing why Twitter needs to care about accessibility. Unlike other apps, the Twitter app has become tightly integrated into iOS. This means that it should follow the same strict accessibility requirements as Apple’s core apps. Fortunately, they have made their iOS app mostly accessible. Unfortunately, their Mac OS X app has zero accessibility, and Mountain Lion has begun integrating it.
The blind have enjoyed Twitter since its beginning because of its accessible format. Twitter just has text – no images, no stupid like buttons, just pure text, just the way we like it. A number of wonderful clients for iOS, Mac OS, and even Windows have sprung up. This ideal situation may not last forever. Disturbing rumors have surfaced that Twitter may stop allowing third party clients. I really hope this doesn’t happen. Twitter would feel the backlash. I might even stop using it. I might not have a choice.
Apple just released their latest update to Mac OS X, named Mountain Lion. People have generally received this update much more positively than Lion. Among other things, it has Twitter integration. This allows you to associate Twitter identities with your contacts. You can also tweet from notification center. Just go to twitter.com in Safari and sign in. It will ask you if you want to use this user name on this Mac. Say yes and you will receive notifications about mentions and direct messages.
To update your contacts, go to System Preferences, then the Mail, Contacts &amp; Calendars pane. In the table of account types, go down to Twitter. A button will appear which says Update Contacts. This will associate the emails in your contacts with Twitter identities. It works very well, and integrates Twitter in a rather amazing way. Interestingly, I missed how many contacts it had updated. I typed “twitter” in the contacts search field, and it pulled up every contact with an associated Twitter identity. I guess it must also search on attribute keys.
Now we come to the bad news. If you go to the contact, you can bring up their tweets. Or so you’d think. Sighted users should right-click on Twitter. Blind users should route the mouse with VO-Command-F5, then click with VO-Shift-Space, not that it will matter. You will see a menu with an option to tweet, and an option to show tweets. This opens the official Twitter app.
And now we come to the point of this article. The official Twitter app for the Mac has zero accessibility. I don’t mean a little, or enough to get by, I mean nothing. VoiceOver shows a close button, a minimize button, and a zoom button. And nothing else.
To reiterate what I said in my other article, if Apple wants to use the official Twitter app, then it must meet the same accessibility standards. This needs to happen now. Dark clouds have begun gathering on the relatively perfect Twitter horizon. Perhaps it has become a little too perfect. Will Twitter remain the cool social network? Or will it descend into becoming another monolith? Only time will tell. Even the smallest person can change the course of the future.
I drank beer in college. I learned C in college. After I graduated I stopped doing both. Now I have begun using C++ and drinking beer again. Coincidence?
I started college in 1995 to get a computer science degree. I remember sitting in my dorm room the first night. Someone offered me a beer and I drank it, feeling sort of independent for the first time. Of course, like most college kids, I had some bad experiences with alcohol. By the time I turned twenty-one I had stopped drinking.
At college I also began learning ANSI C. To briefly explain, in the early seventies, some programmers at Bell Labs created the C programming language. They wanted a small and powerful language which could run on a wide variety of computers. It quickly became popular, and remains in use to this day. This eventually became standardized as ANSI C.
To me C felt clunky and wrong. Nothing I wrote ever seemed to work as I had envisioned. I could program in several other languages, but for some reason I just couldn’t quite get the hang of C. Having someone else teach it to me could have something to do with it. I tend to learn better on my own. And yes, alcohol may have also played a part.
I have a funny memory of having rather a lot to drink and going on IRC. I started chatting with a random guy who it turned out also took a C programming class. He had trouble with some things which I understood, so I offered to write the program for him. I told him he just had to do me one favor: to remind me to get water every ten minutes or so. The arrangement worked out and I wrote the program. I don’t remember much else.
A year or two later I stopped going to college, programming in C, and drinking alcohol. I felt fine with my choices. College sucked. C confused me. Alcohol made me sick. Then, a few months ago I moved into the city. I have come to the conclusion that one has to drink beer to live in Philadelphia. We have lots of microbreweries around, and Hawthorne’s Cafe has the best beer selection in the area.
I first realized this at a block party. Let me tell you that the block parties around here blow the block parties in Swarthmore away. They had cordoned off a street right by the cafe, and everyone hung around listening to an American wanna-be reggae band in the lovely May sun. A friend asked if I wanted a beer. I said I didn’t really know, but he works at a beer distributor and said he’d find me one I’d like. He gave me Daisy Cutter. I really enjoyed it, and it even gave me a little buzz. It made me begin to rethink my view on beer.
Meanwhile, I realized I needed to get comfortable with C on some level. It has some real advantages. It allows one to easily compile a program to provide a simple executable file anyone can run without anything extra. It also allows for easy linking with common libraries of code. I began trying to figure out how to pick up where I left off. I hoped that teaching myself would make it easier.
I quickly realized that C++ had become very popular. In the late seventies a new type of C began to emerge. It used principles of object-oriented programming, something I’ve thought about before. It makes coding real-world problems easier, since it allows one to create a data structure with functions associated with it. In C, the ++ operator increments a variable. Thus, C++ suggests incrementing the language to the next level.
Everything came back to me, plus now I had the added benefit of understanding objects from other languages, especially Ruby. It feels like they took all the things that annoyed me about C and did their best to fix them within the confines of the language. I think they have done a good job. For example, ANSI C didn’t even have a dedicated way to deal with strings of characters. It saw them merely as an array or list of individual characters, and had functions to operate on these arrays. C++ has an object class for strings, which includes all the luxuries of more modern programming languages. Vectors and maps give dynamic arrays and hashes. This means less mucking around with pointers, something I welcomed. Now I have a grip on C++ and I feel good.
In just a few months I have rethought my views on C and beer. I prefer good beer, and I know not to drink too much. I don’t want to get too messed up. I prefer C++, and I know not to use old Ansi C ways of doing things too much. I don’t want my programs to get too messed up. It would appear that programming in C and drinking beer have a positive correlation. And for those who might feel shocked that I would reinvent myself like this, I can only say: All hail Discordia!
I became excited about the Raspberry Pi as soon as I heard about it. I had no idea if a blind person could even put one together, but I ordered one anyway. I received it, ordered some other parts, and to my delight I got it working. I feel like a kid playing with my Apple.
The Pi comes as a circuit board with the components soldered on it. As of now you don’t even get a case. The board has 2 USB ports, ethernet, audio and video out, a Micro USB for power, and an SD slot for an SD card to hold the root filesystem. It has 256 MB of RAM and runs at 700 MHz. Think of it as a computer with the spirit of the eighties, the power of the nineties, and the vision of the twenty-first century. Delicately holding the board in my hands I thought of the movie Pi: “This suitcase isn’t filled with money, or gold, or jewels… just silicon.”
I do not consider myself a hardware person. How many programmers does it take to change a lightbulb? None, that’s a hardware problem! Siri told the same joke at WWDC. Great minds think alike.
I felt a little freaked out at the idea of buying other components and hooking them up to a circuit board. The official store only had European power supplies, and I live in the States. I put out the call on Twitter for the best power supply, and a guy named Andy wrote back. We settled on the Amazon Basics Wall Charger. It has a 2.1 Amp output, more than adequate. I added a Micro USB cable. It arrived right around the same time as the Pi.
I now had all the pieces. I had the Pi, the wall charger, the cable, and an SD card. They have a number of images for download. They recommend Debian, but since I have a rebellious nature I chose Arch Linux. It doesn’t have a GUI installed, perfect for my needs. Plus I already run it on my desktop so I know it well. Its minimalist philosophy lends itself well to this project. I followed the instructions for Mac OS and soon had it ready. The moment of truth had come.
I gently inserted the chip into the horizontal slot. I gently plugged in the Micro USB connector. I gently plugged in the ethernet. I attached the USB end of the cable to the charger, said a quick prayer to Goddess, and plugged it in. I had a horrible feeling the whole thing would go up in smoke with a loud bang and funny smell, but no explosion came.
After my heart calmed down I decided to take the next step. I had already loaded up my router’s DHCP table, and saw the IP address register. I used SSH and sure enough I connected as root! I had done it! I played around, updated and installed some packages, and compiled a small test program. This felt so cool!
The building project had not ended, however. The next weekend a friend brought over some legos, and we made a sweet little lego case for it. It measures 13×9 and has little holes for the ports. Legos have their own nature, and it felt very much in the spirit of the project to play with them. I had to think about layering them, reinforcing, the proper dimensions of the ports, and other very real design considerations. In an artistic touch I added a pyramid on the top. The board actually sits atop the legos very snugly. It worked very well.
I felt pleasantly surprised at how smoothly the whole process went. It also gave me a sense of accomplishment and reminded me of the feeling I had growing up. I started with an Apple II/e and an Echo II speech synthesizer. The Apple became the first personal computer accessible to the blind. I have fond memories of it, and still have it in a box.
The Apple had AppleSoft BASIC integrated into the machine. In other words, it had a programming language built into it, and it intertwined with the operating system. When I first got it I only knew how to RUN a program or get a CATALOG of files on the disc. One day I had the bright idea to type LIST. I figured it would list something, and I figured correctly. It listed the source code to the program I had just run, the legendary Eliza. In a flash I realized that computers just followed instructions like this, and understood what programming a computer really meant. I knew then I wanted to do this all my life.
Now almost thirty years later the Raspberry Pi wants to revolutionize things for children in a similar way. They recommend kids learn Python. I’ve never played with Python because whitespace infuriates me, and the concept of it having significance makes me feel somewhat nutty. I imagine having an argument with the interpreter much like the guy in the argument sketch. “Is this the right room for an argument?” “I told you once.”
But that doesn’t really matter. Everyone has unique tastes, and different languages suit different people. Whatever language kids choose, I hope this project will help. It has already helped me get more comfortable with hardware. If you want to learn more about computers in a very hands-on way, then take a byte of the Raspberry Pi! And by the way, to my blind readers, they spell it P-i, like the Greek letter or the awesome movie referenced above.
This article will conclude the series detailing my three-day intensive to learn echolocation. By making a tongue click, a blind person can learn to retrain their brain to activate the visual center through reflected sound. This gives the equivalent of long-range vision. If you haven’t already, you should read about the beginning of what we called Echolocation Woodstock, and what happened the next day. That will bring you up to speed. The last article ended with me lying in my bed, seeing the ceiling above me without even clicking and realizing that I had unlocked something much greater.
Before I continue I wanted to answer some questions. People ask exactly what I mean by seeing objects. I do not mean just hearing a sound, though it starts from that point. I do not mean visualizing something in the mind’s eye, though no doubt it plays a part. When I see something with echolocation, I actually see a dark form like a mannequin positioned around me in space. Echolocation shows an object’s size, mass, material, spatial position, everything sight gives minus color and finer detail. It feels like the most amazing thing to actually see the world around me in this new way. These articles have taken longer to write because I have had to come up with ways of explaining this amazing ability.
Some people have wanted to know how to start learning echolocation. This includes some curious sighted people. In fact, they might have an easier time in some ways, because their brains already have the pathways to process visual information. Start as I did, with the panel exercises I talked about in the first part. Get a plate and hold it in front of your face. Make a “shsh” sound while moving your head around as though looking at the object. Listen for the center and edges. Trace the edges. Find the object’s position in space. Reach out and touch it. Build on that. Then move on to other objects, comparing and contrasting them. At the same time, work on developing your click, and gradually transition from the constant sound to the click. That will give you a good sample. As I wrote before though, you can learn how to look, but you need an intensive to learn how to see. Everyone has unique challenges, and you need the real-time feedback.
Now back to the action. We woke up the next day and had almond croissants and Goddess Cups. We also had our super foods, of course. We discussed what we wanted to do for the day. By chance I mentioned that this building has an awesome roof deck. That seemed like a good idea, so we decided to do some object identification there. After that Justin wanted to go to a department store. I didn’t know how that would go down, but figured we’d cross that bridge when we came to it.
We headed up to the roof deck and started clicking at objects. I suddenly realized that I had done the click slightly incorrectly all this time. First, make a “chch” sound. Keep the tip of your tongue in that exact place. Now quickly bring the center of your tongue down, making a sharp clicking sound. A higher frequency gives more resolution, and this proper tongue position allows for a louder click which broadcasts over a longer distance. In a flash I realized that this complemented my meditation. An energy circuit runs up the spine and down the front of the body, and touching the tip of your tongue to this spot completes that circuit. It all clicked together, if you’ll pardon the pun.
Something else happened in this flash. Justin had me click across the street at a church. He could identify it by its shape. I tried, and had the realization I just described. When I clicked properly, I distinctly heard a longer echo come back from farther away. My brain translated it and flipped! I physically felt my brain reel. When it recovered sure enough I could see something farther away. I couldn’t identify it as well as Justin, but I definitely saw a building-like object across the street with a tree in front of it. I went nuts! I had just seen an object at a distance. Hearing that longer echo did something.
After I recovered myself, we walked around the roof deck finding other things to click at. I liked how Justin never considered any of my guesses as wrong. Rather, he would say “No, but I understand why you’d think that.” This understanding really helped. The brain perceives things correctly, you just have to learn how to interpret the perceptions correctly. We found a power box, and other random things. While looking across the wide railing, Justin told me that when sighted people want to see along something, they have to look up. What? Look up? This didn’t make sense to me, but I tried it and it worked, I could see along the railing. Eventually I found a table and some chairs and we sat down. I needed tobacco. We talked about the next step, and decided to get a cab to Macy’s, a nearby department store. I had no idea what to expect, but I put my faith in Justin and of course in Goddess.
This building started out as a department store called Wanamaker’s, the first department store in Philadelphia and one of the first in America. Anyone who has lived in this area for a long time remembers Wanamaker’s. Unfortunately it no longer exists. Now it has become a Macy’s. I had some idea of the intensity to expect, but had never actually gone there. Justin felt impressed and maybe even a little overwhelmed. Sighted people feel this way too. Nevertheless, we plunged forward.
The whole thing felt like a fun adventure. I learned how to use echolocation to maneuver through paths. This skill comes in handy all the time. We went around all kinds of crazy paths. I also learned how to find my way through paths to a goal, such as an entrance.
The main lobby tripped us out. It has a very high ceiling, large stone columns, and balconies. This provided all kinds of cool sounds and images. We really had fun figuring this out. I could appreciate this in a whole new way.
We started on the first floor by some dresses and handbags. No doubt we looked very weird clicking and tromping around but definitely focused on something. We found our way to the amazing lobby and went upstairs. We saw more stuff and walked around aisles. We ended up on the third floor by beds and cookware. Justin mentioned that Mother’s Day would come up on Sunday, and I had the idea to buy my Mom a gift. We hailed a woman and she helped me pick out a white all-purpose baking dish.
We found our way outside and I called for a cab. They could not come for an hour. We wondered what to do. I eventually decided to go back in to the madness and buy a new wallet. Everything went wonderfully and I felt glad I did it. I would have never just travelled to a department store by myself. Echolocation opens doors.
We finally got a cab home. We had a cabby named Victor who wanted to know more about echolocation. “Are you bullshitting me?” “No we’re serious!” “He’s teaching me how to do it.” We blew his mind. This has happened a number of times since.
I have to tell a quick aside. Every once in a while one or both of my parents would freak out and want me to get a guide dog. I’ve never had any interest. I just can’t turn my will over to an animal. Justin pointed out that you really can’t compare cane and guide dog users, because a guide dog user still has a sighted guide, no criticism intended.
Anyway, people would go on and on about how independent guide dog users look. They would also tell me that a guide dog makes a good conversation starter. “And you know, girls love dogs.” I considered that totally vain. I would not just get a guide dog for that. Now I felt glad that I had found an even better conversation starter!
While walking back to my apartment, something else interesting happened. When walking I would usually hold one arm up a little, as a natural defense. Most blind people do, and with good reason. “Hey, are you holding your arm up?” asked Justin. I said yes, starting to realize something. “Um, yeah, you don’t need to do that anymore.” And I saw what he meant. Now that I had echolocation I could easily tell if an obstacle lay in front of me. As promised, it had improved my posture.
We got home and ate some pizza my mom picked up for us from a local shop. We had talked about the Lord of the Rings, and Justin said that he had never seen the Return of the King. As any fan knows, the extended version rules! If we wanted to watch it we would have to do it on this last night. I put it on, but Justin quickly fell asleep next to a big salt lamp. I watched it anyway, and as usual had some powerful insights.
Watching that amazing scene with Gandalf and Saruman made me think back to earlier to the roof deck. While walking around, Justin told me to take Gandalf steps. I asked what he meant. “If I were training a six-year-old, I would tell them to walk like a seven-year-old. How do you think Gandalf walks?” We had a quick discussion and decided on stately, with purpose. Walking in this way helps maintain orientation. I had experienced that on the previous day and wrote about it then, how walking quickly helped keep me oriented.
Then the scene with Frodo and Shelob came on. Frodo holds up the Phial of Galadriel and it flashes back to the Fellowship of the Ring. Galadriel says: “May it be a light for you in dark places, when all other lights go out.” It all hit me at once. Echolocation does exactly that! A light to guide you in dark places! And now I finally understood the name World Access for the Blind, the only organization teaching this technique. Before I thought it just meant that they train blind people around the world. Now I knew that it really meant that they give blind people access to the world. I felt illuminated.
The next day I had to rest a little. We wanted to start earlier since we didn’t have much time, but my poor brain just wouldn’t allow it. I pulled myself together and prepared myself for my final ordeal so to speak. Once again we walked to Whole Foods. The route felt more familiar, but I need to work on my street crossings. Justin explained that those blind from Retinopathy of Prematurity tend to not do well at street crossings, because they can often become easily disoriented by loud sounds and sound tracking. True enough. He helped me clean up my cane technique a little and showed me how to get better oriented using echolocation. We made six crossings and I did four of them well. I still have work to do, but feel a lot better off. Justin assured me that things would make more sense once my brain had time to put it all together, and sure enough a few hours later it started to. He left right after that and I felt changed forever.
When we first started chatting I felt amazed that someone could learn this in three days. I asked him multiple times and he promised me that he could teach me the basics, enough to get me on my way. I got more than I could have ever hoped for. I thought I would just learn something cool to help my mobility skills. Instead I learned how to see.
To find out more about echolocation, contact World Access for the Blind. They work on a donation basis. Amounts generally range from $500 to $1500 per day of training. You can pay in one or several sums, or with a monthly subscription. Not a bad deal, considering what you get. I believe every blind person should learn this skill. Don’t wait fifty years for the establishment to catch up. Do it now!