Home
Welcome to my homepage. I became blind at birth from retinopathy of prematurity. I developed an early interest in computers and radio. I use Linux, Mac OS, and iOS. I have an Extra class amateur radio license. I practice Qigong daily. I consider myself a Taoist. I don't eat meat, and have a genetic sensitivity to gluten. For the rest, you'll have to read my articles.
The Beginner’s Guide to Echolocation
November 08, 2012

I have some very exciting news. Ever since I started learning about echolocation I wanted a way to get started myself. I made contact with Justin at World Access for the Blind and he helped me on Skype before we did my amazing life-changing intensive. Still, we all agree that we need a way to easily teach the blind about echolocation, or at least give them enough information to get them started safely. We also need to prove to the skeptics that it really exists. Now someone named Tim Johnson has written the perfect book to get you started.
The large print version of the Beginner’s Guide to Echolocation: Learning to See with Your Ears sells on Amazon, though as any blind person knows by now, Amazon does not care about accessibility. Fortunately, he has also made an accessible version available, so long as you can read MS Word documents. The accessible version costs twenty-three dollars. He also has an audio version available for $37.00. It contains the complete text, plus demonstrations of the different types of tongue clicks, an essential addition. This has become the more popular version, and with good reason.
Colorful quotes from such luminaries as W. B. Yeats and Albert Einstein decorate the book. People who use echolocation pick these lofty heroes for a reason. This skill represents something truly amazing, something which will completely shift your sensory paradigm and move you into a better place psychologically.
He emphasizes the importance of meditation. Simply allowing yourself to listen to the sounds which surround you can help train your brain. I love meditating, and have begun writing a book about it myself. It seems that echolocation activists also share an interest in opening the third eye through meditation. This does not happen by accident. By the way, eating superfoods also helps.
I also liked how the book uses music as a reference. You can practice listening to music as a way to boost the range of your hearing. You use the same skill to sort out signals when doing echolocation. Music also uses a lot of reverberation, and these echoes share similarities with the ones used in echolocation. Understanding how sound and music work will aid you in your understanding of how echolocation works.
The book presents many of the same exercises Justin had me do over Skype, as well as some of the things we did on our first night. As the book points out, everyone perceives echolocation differently, and will have to arrive at their own understanding and ways of explaining it. I liked how he had exercises to do individually, but also ones which require a partner. Having someone else holding the objects introduces an unknown element, something vital for your progress. Every blind person faces unique challenges. You need to push yourself just enough to make small mistakes so you can correct them and grow more confident.
I found it interesting that he suggested opening a car window and listening to the echoes to get a sense of echolocation. It does not give quite the detail or accuracy of a tongue click, but it most certainly works. A woman who taught me as a child reminded me that we would ride in her car with the windows down, and I could tell her about the passing telephone poles. Of course at such a young age I did not think of it as echolocation, but it makes perfect sense. Even a simple exercise such as this will prove its validity.
I felt most interested in the discussion of using the visual cortex of the brain to build non-visual imagery. This sounds like what I experience. When I say I see something with echolocation, I really mean it. I actually see the dark form of an object. For me it also has a strong synesthetic component. In other words, if I click against a glass surface, I will get a cool feeling that reminds me of glass. You have to learn to open yourself to these unique sensations to truly succeed.
The book ends with some recommendations of what to do next. Again, they strongly recommend the three-day intensive about which I’ve raved extensively. Along with World Access, they also list an organization in the UK called Visibility. If you have done everything up to this point, you will have a good background for approaching these organizations for further training. Now the excitement really begins!
If you’d like to see the potential of echolocation, then buy this book and try the exercises. Think about it: wouldn’t you pay twenty-three dollars to begin to learn how to see? This book will show you just that. For the full experience you’ll need to do an intensive, but this will let you know if you should think about the more serious commitment. In my opinion you really can’t lose. If you can hear then you can see! Go for it!
Links as Language
October 27, 2012

Recently I attended a talk put on by the Philadelphia Area New Media Association, entitled “Links as Language: How Hyperlinks Are Changing the Way We Read and Write.” I found it very interesting, and it started me thinking about the unique way a blind person perceives the internet.
The event took place at the Wharton School of Business, the oldest and, according to many, the finest business school in the country. I emailed ahead of time and the organizer of the event said they could accommodate me. I got a cab there and listened to the cabby bitch about his NutriBullet juicer, which gave him a headache. I used echolocation to find my way inside and to the desk, where they got someone to escort me to the room.
I arrived early for a change, and greeted a few people. Pizza and coke arrived, and I accepted some, though beer or water would have suited me better. I set up my MacBook Air and prepared to enjoy.
David Dylan Thomas began by showing a sentence. I asked him to read it aloud since I can’t see it. He did, and continued this practice throughout. This got me thinking about accessibility right off the bat.
A lot of the presentation revolved around the concept of artful linking. Links act like metaphors, and you can use them as an effective writing tool. Linking to something in a clever way delivers a reward. It also makes more sense from an accessibility perspective.
He said that a hyperlink has words underlined in blue. Honestly, up to this point I never knew this. I don’t see the web, I hear it with a screen reader. To me, a link just has the word “Link” or “Visited Link” prepended to the name. For example: I don’t see the web, I hear it with a, link, screen reader.
I have noticed this construct become embedded into my internal dialog. My subconscious uses it as a way to indicate a link to another thought. External technology imitates internal technology. The internet acts like an external form of telepathy. It serves as a perfect metaphor for the collective consciousness.
These thoughts blended perfectly with the talk. Soon he asked a great question: “Who can tell me the two most useless words in a hypertext link?” Of course I knew the answer. “Click here!” A bunch of people seemed to agree.
Once again it brought home the notion that accessibility really affects everyone. To me, “click here” makes no sense. Until recently a blind person could not click anything. Now someone can on an iPhone/iPad, or if using a Magic Trackpad on a Mac, but for the most part blind people do all their navigation using the keyboard. Thus it means nothing.
He then asked how many people had someone teach them how to use a hyperlink. A few tentative people said yes. Then he asked how many people just instinctively knew how to use a hyperlink. Of course most did. Then he said my favorite sentence of the presentation: “Click here is postmodern. It’s like a stop sign that says ‘This is a Stop Sign.’” People already know how to use a hyperlink. You don’t need to insult their intelligence.
This got me thinking back to my first web browsing experiences. I think it happened on an online service called Delphi. Back in the good old days of 1994 we just had text terminals, none of this fancy graphical nonsense. We had to scroll through a page at a time. The text contained bracketed numbers [23] like this. At the page prompt you could type in the number to follow that link. And we loved it.
We also loved playing games. These online services had the first multiplayer online games. I particularly remember one on GEnie called Federation II. I spent lots of money “studying for school” when that game came out. But when you think about it, we played the first multiplayer online games, and it just seemed so cool.
I’ve also always enjoyed interactive fiction. These text adventure games print a description of a room, and accept text input from the keyboard. They began in the seventies, peaked in the eighties, went underground, and now have begun to resurface partly thanks to portable devices such as the iPhone and iPad. They combine a story with source code in amazing ways. My interest resulted in an interview in the excellent documentary Get Lamp. I recommend it if you’d like to know more about interactive fiction.
With that thought, we can now explore the idea of the web as a text-based gamespace. If you picture a page as a two-dimensional space, you can consider hyperlinks as the third dimension, or Z axis. The links connect the levels. Just as text adventures foreshadowed video games, the web foreshadows a virtual reality. The links act like connections in the brain. The web behaves a lot more like an artificial intelligence than many of our contrived attempts. We’ve already done amazing things with augmented reality, overlaying the web on the real world. One day we may do the inverse, modeling the real world and its objects on the web.
He took a quick aside which I felt good about, so I will detour at this point as well. More often than not, restaurants just link to a PDF copy of their menu. I have called PDF the Pain-in-the-Ass Document Format since it came out in the nineties. The worst experience happens when the PDF just contains an image scan of the menu, as opposed to the actual text. This makes it impossible for a blind person to read. David made the point that restaurants should stop thinking of the menu as a simple posterboard, and start thinking of it as an opportunity to give a whole interactive experience. I agree!
Technology has changed so much. When I started going online, bulletin board systems acted like village pubs. Online services came along and felt like little cities with shopping malls. The internet connects things in an even greater way. To me, putting something on the web sometimes feels like installing an art exhibit in a public toilet. David chose a more elegant metaphor, like a star in the Milky Way. Both work.
The tools to author hypertext have also evolved. For a long time, inserting a hyperlink meant putting in raw HTML code, <a href="http://behindthecurtain.us">like this</a>. I didn’t particularly mind, though it made the text far less readable. Emacs came out with a way to do it which worked well but still felt clunky. Now on my Mac I just select text with Shift-Command-left/right arrow or with VO-Enter. I then hit Command-K and insert a link. This works in many standard applications such as TextEdit and Mail, and it also works in MacJournal. This increased ease means increased use of hyperlinks, and since the text stays readable, more artful link text as well.
We still have some problems to overcome. Right now, we have so many file formats. This already causes trouble, and it will only get worse as time goes on and more data becomes irretrievable. We also need to solve the problem of persistence. If a page changes its address, then links which pointed to it become invalid. Use of URL shortening services has made this worse.
David closed with a good point. In 1999 links just took you from one place to another. Now things have become less linear. Instead of thinking of the web as just a place to put our stuff, we should think of it as a place to connect our stuff. This really wrapped the whole thing up well for me.
I have only addressed the major points and how they relate to my own interests. I recommend going to this talk yourself if you have the chance. No doubt you will come away with something valuable. I feel glad I went.
Unfortunately, I don’t think I can go to next month’s PANMA talk, which discusses Flash. You can easily guess my opinion of that. Fortunately, an event called BarCamp Philly will happen this weekend, and by all accounts I have to go. I hope if you go that you will introduce yourself to me. Their ticket system has already given me problems, which BarCamp’s staff has done their best to resolve, so already I see the beginnings of a good article. See you at the pre-party, hopefully.
Echolocating Sculpture: A Monument to Abstraction
October 27, 2012

About six months ago I learned a skill called echolocation. By making a tongue click, a blind person can learn to see with reflected sound. Read that article first if you haven’t, as this one depends on it. Only one organization in the world teaches this skill: World Access for the Blind. They deserve your support. Every blind person who can should learn it.
During the intensive, my teacher Justin said that I could use echolocation to see sculpture. This intrigued me. Of course, I immediately wanted to go to the Rodin Museum and try it out. Justin said I should do it myself later so we could work on more practical things. I agreed, but really wanted to go. Today I had my chance.
My father runs the Seraphin Gallery. Once in a while he will ask us (his kids) to go to an art opening. Usually they have paintings, which obviously I can’t get too excited about. At least I’d get free wine. This time however he said they would have sculpture, so that piqued my interest. I told him of my plan to use echolocation to try to see sculptures.
Most art museums will not let you touch the sculptures, sometimes even getting quite mean about it. I recall a field trip to the Philadelphia Museum of Art. They really didn’t want me touching their precious sculpture, and made me wear gloves. This totally took away the appeal. Marble feels muddy under cloth. Too bad I didn’t know echolocation then.
Fortunately, tonight’s opening did not happen at the Philadelphia Museum of Art, and I could touch everything, since my father owns the gallery. I even got to have a chat with the sculptor, David Borgerding. I felt excited that I could touch the pieces, but I felt just as excited about trying echolocation to see something abstract.
I walked there with a friend of the family named Alex, who should have a blog of his own. We entered the gallery and I just started echolocating to find sculptures. I felt like a kid searching a room for treasures. And sure enough I found them!
They arose like dark forms, monuments of abstraction. I could scan and make out the major features. After using echolocation, I would allow myself to touch them and get the fine details, then go back to echolocation to appreciate each piece at a distance and with a holistic perspective. You know that tale about the blind men touching an elephant? That would never happen with echolocation, which lets you see the whole structure instead of its discrete parts.
I saw a lot of waves and appendage-like forms. Even the squares did not have perfectly square shapes. No right angles, just curves. The artist’s statement confirmed this. I gravitated to two in particular. The first reminded me of a sailboat. The second one reminded me of the monument to abstraction I referenced above. David actually took this picture himself, so there you have a picture of a sculpture taken by the sculptor. This one also had an amazing texture, since he made it out of bronze and polished it somehow. I think gold also played a part.
Hearing about these colors reminded me of another visual aid, the Color ID app I have previously used to watch a sunset. It accurately identified the colors of the metals as I passed the iPhone over the sculpture. The app has exotic colors which I enjoyed in this artsy setting, especially Almond Frost, whatever that means. The simple colors proved more practical in a basic sense, grays and browns mainly. Now I had three ways to appreciate this art: touch, echolocation, and the Color ID app on my iPhone. This gave a very complete picture.
While discussing all this with the artist and others, I realized something else. Normally I use echolocation in everyday settings, such as finding a path, following the shoreline of a building, or enjoying the organic patterns of a tree. Now for the first time my brain saw something completely abstract. It tried to put names to the forms but ultimately could not. The artist agreed, saying: “It’s nothing, and it’s everything.” This made for a novel experience. The visual centers of my brain felt satisfied and saturated.
By this time the crowds had begun filing in, making echolocation less effective, especially for appreciating aesthetics. The wine went to my head and I felt like eating. Alex and I walked to a nearby restaurant. By the time we returned, the showing had ended. I look forward to appreciating sculpture again, especially now that I can see it. I like sculpture!
RubyMotion Rocks!
October 04, 2012

Ever since I started using an iPhone, I have wanted to learn how to write apps for it. I made several attempts to learn Objective-C, but it never worked out. Then one day I learned about RubyMotion and it changed my life forever, just like the iPhone itself. I have just finished the tutorial and have a basic understanding of how to write an app. RubyMotion rocks!
After Steve Jobs left Apple, he formed a far-out computer company called NeXT. They developed what they hoped would become the next amazing computer, especially for educational institutions. They wrote a custom operating system called NeXTSTEP. They adopted a programming language for it called Objective-C. It combined the standard features of C with a unique object-oriented syntax, including keyword arguments.
When Apple bought NeXT and hired Steve back, they decided to use NeXTSTEP and Objective-C. This became the core of what we now know as Mac OS. It then found its way into iOS. To this day, many objects start with NS, such as NSString and NSURL. The NS, of course, stands for NeXTSTEP. Seeing NS always reminds me of the whole story, and how one never knows how one’s accomplishments and actions will influence the future. NeXT failed, but their work succeeded.
Since I wanted to write apps, I had to learn Objective-C, or so I thought. As I have written previously, I’ve never had good experiences with C. It reminds me of a bad relationship – you try to make it work, but it just doesn’t.
I began to assume I could never write apps, but remained hopeful.
One day, I read an article on Cult of Mac about a hot new iOS programming course called Tinkerlearn. It attracted my attention because it has the lessons within comments in the source code. Programming languages include a way to leave comments which the language ignores. This lets us mortal humans document our code. Embedding lessons in comments seemed very creative, not to mention highly accessible. It cost $14.99, so I figured, why not? I bought it and fired up Xcode, Apple’s development environment, the program you use to write apps.
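To give you the flavor, I sketched out what a lesson embedded in comments might look like. I wrote it in Ruby rather than Objective-C, and invented the lesson myself, so take it as an illustration of the idea, not anything from the actual course.

  # Lesson 1: Variables
  # A comment starts with # and the language ignores it completely.
  # A variable gives a name to a value so you can refer to it later.
  greeting = "Hello, world!"
  # The next line prints the value to the screen. Try changing the text!
  puts greeting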
In the past I have joked about a correlation between programming in C and drinking beer. This applies to all dialects of C. In the case of Objective-C, forget the language, I felt like I needed to drink a beer just to use Xcode. I don’t know how it looks to sighted people, but to me it seemed like a very complicated program for a very complicated task, with lots of very complicated controls and strange areas of the screen for all sorts of esoteric things, when I just wanted to write and compile a program. And this from an Emacs user! Nevertheless I trudged on, and started to get into the course.
I emailed Parker, the author, and we struck up a dialog. I told him of my hope to write apps, and of the challenges facing a blind developer. Specifically, I wanted to know about designing interfaces programmatically. Sighted people use something in Xcode called Storyboards to visually lay out the screens of the app, then they add hooks to these elements. VoiceOver can read none of it, so I have to create my interfaces with raw code. Some sighted people also prefer this, and Parker actually released a modified version of one of his lessons specifically to teach it. I felt overjoyed, but confused.
After I wrote an article about how much I loved Ruby, Parker wrote me back on Twitter, agreeing with me. I wished aloud that I could write apps in Ruby, fully knowing of Apple’s restrictions. Parker wrote back and asked if I had ever heard of RubyMotion. It lets you write iOS apps in Ruby! Really? And it uses standard terminal utilities! Really? And you can use your favorite text editor. Really? I emailed Laurent Sansonetti, the author of the program, and he said if it didn’t work he’d refund the $200 price. I figured I’d spend that on headache medicine if I continued learning Objective-C, so I took the plunge.
Have you ever visualized something, but just figured it would forever exist in your imagination, only to one day find out that it really does exist? You get a very weird feeling actually seeing it on the physical plane. I felt exactly like that when I first got RubyMotion working. It felt like the spirit made flesh, like a dream made real, and like the way I could finally write apps. Welcome to the future!
I just finished the tutorial by Clay Allsopp. The entire RubyMotion community feels exhilarating, and has given the utmost help in my unique situation. I can’t say I can write an app, but I actually understand the basics. Most importantly, I understand the way different subviews combine in a main view to make what you see on your iPhone when you run the app. I still have a lot to learn, but for the first time the pieces have started sliding into place!
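To make that concrete, here you have a minimal sketch of the kind of app the tutorial builds up to. The structure follows real RubyMotion conventions, but I invented the names (GreetingController, the label text), so take it as the flavor of RubyMotion, not Clay’s actual code.

  # app/app_delegate.rb: the entry point of every RubyMotion app
  class AppDelegate
    def application(application, didFinishLaunchingWithOptions: launchOptions)
      @window = UIWindow.alloc.initWithFrame(UIScreen.mainScreen.bounds)
      @window.rootViewController = GreetingController.alloc.init
      @window.makeKeyAndVisible
      true
    end
  end

  # A view controller built entirely in code, no Storyboards required.
  class GreetingController < UIViewController
    def viewDidLoad
      super
      view.backgroundColor = UIColor.whiteColor
      label = UILabel.alloc.initWithFrame(CGRectMake(20, 100, 280, 40))
      label.text = "Hello from RubyMotion!"
      label.accessibilityLabel = "Greeting"  # VoiceOver speaks this
      view.addSubview(label)                 # the subview joins the main view
    end
  end

Running rake in the project directory compiles the app and launches it in the iOS simulator, standard terminal utilities, just as promised.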
One day you will see my apps in the App Store. I have lots of ideas to change the world and make people laugh! RubyMotion makes it all possible. I’ll take Ruby and Emacs over Objective-C and Xcode any day! Dreams really do come true.
Why Ruby Works
September 08, 2012

I recently wrote that I had begun drinking beer and coding in C++ again. I wondered about a possible correlation. I found one. Both can cause headaches. I’ve gone back to using Ruby, my favorite programming language and my birthstone.
Object-oriented programming has intrigued me, as it has many others. It attempts to model the real world. An object has data fields called instance variables, and functions which run on that data called methods. Classes can derive from other classes. For example, you could have a Vehicle class, then a Car class derived from the Vehicle class – a car represents a kind of vehicle. The Car class would include all the functionality of the Vehicle class plus whatever else it adds. I once joked that objects just keep going down forever, like the Native American belief in turtles stacked on top of one another which support the world. While it might make for an interesting thought experiment, I must partially recant it.
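A quick sketch in Ruby makes this concrete. I invented the details (wheels, honking) for illustration:

  class Vehicle
    def initialize(wheels)
      @wheels = wheels           # an instance variable holds the object's data
    end

    def describe                 # a method runs on that data
      "a vehicle with #{@wheels} wheels"
    end
  end

  class Car < Vehicle            # Car derives from Vehicle
    def initialize
      super(4)                   # every car gets four wheels
    end

    def honk                     # plus whatever else Car adds
      "Beep beep!"
    end
  end

  car = Car.new
  car.describe                   # => "a vehicle with 4 wheels"
  car.honk                       # => "Beep beep!"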
I love Ruby. I’ve loved it ever since I started learning it. It started in Japan and has grown from there. It has a very elegant and uniform syntax. This happens because of its philosophy.
Ruby truly treats everything as an object. In most languages, if you want to get the absolute value of the number 5, you’d write abs(5). This calls the abs function with 5 as its argument. In Ruby you’d write 5.abs. This calls the abs method on the number 5, an object of class Fixnum. Fixnum itself derives from class Integer, which derives from Numeric, which derives from Object, which derives from BasicObject, which derives from nil. See the pattern?
All objects belong to a class. All classes derive from another class. At the least, they derive from the class called BasicObject, and BasicObject itself derives from nil. In other words, all objects come from nothing.
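You can check this chain yourself in irb, Ruby’s interactive shell. The results below assume Ruby 1.9, current as I write this, where whole numbers belong to class Fixnum:

  5.abs                   # => 5, the abs method called on the object 5
  5.class                 # => Fixnum
  Fixnum.superclass       # => Integer
  Integer.superclass      # => Numeric
  Numeric.superclass      # => Object
  Object.superclass       # => BasicObject
  BasicObject.superclass  # => nil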
To me, this sounds very Zen. It reminds me of the zero point experienced in deep meditation. All things come from the no-thing. This void energy contains infinite potential, and brings creation into existence. Ruby models this truth, and this makes the language work. A consistent philosophy produces a consistent syntax.