
Deaf fish in a hearing pond, Part 2

This post is Part 2 in a series. You can read Part 1 here.

Functional Requirements (Hardware)

The first thing I did was outline the functional hardware requirements necessary to roll my own communications setup.

  • The total combined setup must be as compact and as lightweight as possible, preferably less than 4 pounds total.
  • The devices in the setup must be able to discover and communicate with each other wirelessly, with as little on-the-spot configuration as possible.
  • Each device in the setup must be able to wake from sleep and get into the app as quickly as possible.
  • Each device must have a physical keyboard with an intuitive layout (don’t put the punctuation keys in stupid places; I’m looking at you, Dell).
  • The entire setup must use only off-the-shelf parts and devices.

Functional Requirements (Software)

Next, I outlined the software requirements.

  • The user interface must be as obvious and intuitive as possible, with no verbal or written explanation needed for how to get going with it.
  • Must be able to go fullscreen and hide distracting OS elements.
  • Must be as close to 100% keyboard driven as possible, with the exception of a minimal number of touch buttons, ideally just one.
  • Must make as much use of the screen real estate as possible for the text elements, which should be large and readable.
  • Connection should happen automatically with no user input.
  • Must be able to run on any hardware, at least during prototyping.

Inventory

The prototyping process for the software was the first thing I wanted to tackle, so for development purposes I was going to use whatever hardware I already had on hand.

I happened to already have a Windows 8 tablet and a couple of Bluetooth keyboards, so I designated the tablet as the testing machine. My iMac is the development machine and will simulate the other side of the conversation for now.

I also have a hilariously tiny HooToo TripMate Nano travel router and an Anker Astro E5 16,000 mAh portable power pack, which are both already permanent residents of my messenger bag. When connected to the Astro E5 for power, the TripMate Nano provides a wireless LAN that can run for a ridiculously long time, and the whole thing can be ignored at the bottom of my bag for all practical purposes. I can tether the Nano to my iPhone if I need WAN access for any device connected to the Nano.

I decided to use that setup as the connectivity backbone during the prototyping process–both communication devices would simply be connected to the Nano’s LAN to simulate an active device connection. Later on, this can be replaced by a direct Bluetooth connection or something similar.

First Prototype

For funsies, I did the first software prototype in Unity3D because it was a convenient way to do some research I needed to do for my day job anyway–specifically, getting to grips with Unity’s new UI system and experimenting with networking. I like to catch as many birds with a single stone as possible!

A weekend and some evenings later, I had a functional prototype that I tested on OS X and Windows. I deployed it to the tablet, fired it up, and had a number of pointless conversations with myself.

[Screenshot: the first prototype (f2f_v1_screenshot)]

It worked great. The only problem was that Unity is overkill for something like this, and my poor Dell tablet quickly heated up to the point where I could’ve probably fried up some eggs and sausage patties on it. But, that aside, the basic idea seemed sound, I’d learned some new Unity UI and networking tricks, and now it was time to move on to a serious prototype.

Second Prototype

The second prototype was done as a web app, served over the LAN hiding in my messenger bag. This way, I don’t have to screw around with building app packages for a bunch of devices during the prototype cycle; all I need is for each device to support Google Chrome. From there, I just create an app shortcut for the page on each device and set it up to run fullscreen. When I tap the icon for that shortcut on the start screen, the app pops up in fullscreen, all ready to go.
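Chrome’s app shortcut settings handle the fullscreen part for me, but a page can also request it itself through the standard Fullscreen API. Here’s a minimal sketch in TypeScript; the button id is made up for illustration, not lifted from my prototype, and older Chrome builds expose the call under a webkit prefix instead:

```typescript
// Minimal sketch: request fullscreen when the start button is tapped.
// "start-chat" is a hypothetical element id, not from the actual prototype.
const startButton = document.getElementById('start-chat') as HTMLButtonElement;

startButton.addEventListener('click', async () => {
  // Must be triggered by a user gesture, which the tap/click provides.
  await document.documentElement.requestFullscreen();
  // ...then switch over to the chat screen.
});
```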

Chrome has fairly robust WebRTC support, which means you can directly connect two different machines and transfer information between them without it having to go through a server first. The only thing handled on a server is the initial connection setup, and after that it’s peer-to-peer data exchange between connected clients.

So, the second prototype uses WebRTC to pass data between clients. I whipped up a basic web app using Macaw and Atom, and tested that on multiple machines.
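To make that flow concrete, here’s a rough TypeScript sketch of the per-keystroke exchange over a WebRTC data channel. This isn’t the actual prototype code: the element ids and the sendSignal()/onSignal() helpers are placeholders for whatever the signaling server provides, and only the offering side is shown (the answering side mirrors it with createAnswer()).

```typescript
// Rough sketch of per-keystroke chat over a WebRTC data channel.
// sendSignal()/onSignal() are hypothetical stand-ins for the signaling
// transport (the server-side piece that handles initial connection setup).
declare function sendSignal(msg: object): void;
declare function onSignal(handler: (msg: any) => void): void;

const pc = new RTCPeerConnection();
const channel = pc.createDataChannel('chat');

// Send the full contents of my text box on every edit, so the other
// person sees each keystroke (and correction) as it happens.
const myText = document.getElementById('my-text') as HTMLTextAreaElement;
myText.addEventListener('input', () => {
  if (channel.readyState === 'open') {
    channel.send(myText.value);
  }
});

// Render the other person's text as it arrives.
const theirText = document.getElementById('their-text') as HTMLElement;
channel.onmessage = (event) => {
  theirText.textContent = event.data;
};

// Standard WebRTC handshake: trade the offer/answer and ICE candidates
// through the signaling channel; after that, data flows peer to peer.
pc.onicecandidate = (e) => {
  if (e.candidate) sendSignal({ candidate: e.candidate.toJSON() });
};
onSignal(async (msg) => {
  if (msg.answer) await pc.setRemoteDescription(msg.answer);
  if (msg.candidate) await pc.addIceCandidate(msg.candidate);
});

async function start(): Promise<void> {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendSignal({ offer });
}
start();
```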

There’s an intro screen that tries to get the point across, with one button to start the chat.

[Screenshot: the intro screen (f2f_p2_screen1)]

That button takes you to the actual chat screen, which again tries to explain itself as succinctly as possible. At this point, the WebRTC connection is automatically made and the chat is initiated.

[Screenshot: the chat screen (f2f_p2_screen2)]

Here are a couple of shots of the web app in action on the 8″ tablet, which has a Microsoft Wedge keyboard connected via Bluetooth. Any device on the portable LAN can serve the app if a lightweight HTTP server like Mongoose is running:

[Photo: the web app running on the tablet (2015-05-24 18.04.13)]

When a connection is made between both clients, the avatars are swapped out with little webcam thumbnails. That’s why you can see me taking the photo in both thumbnails.

[Photo: webcam thumbnails after both clients connect (2015-05-24 18.08.15)]
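Under the hood, the thumbnail swap is pretty simple in Chrome: grab the local webcam, preview it in a small video element, and send the track to the peer over the same connection. Here’s a hedged sketch, assuming pc is the peer connection from the earlier sketch and with made-up element ids; 2015-era Chrome actually used the prefixed/legacy forms of these calls, so take it as the idea rather than the exact code:

```typescript
// Sketch of the webcam thumbnails. Assumes `pc` is the RTCPeerConnection
// from the earlier sketch; the element ids are hypothetical.
async function showThumbnails(pc: RTCPeerConnection): Promise<void> {
  // Local thumbnail: capture the webcam and preview it.
  const localStream = await navigator.mediaDevices.getUserMedia({ video: true });
  const myThumb = document.getElementById('my-thumb') as HTMLVideoElement;
  myThumb.srcObject = localStream;
  await myThumb.play();

  // Send the webcam track to the other side. In a real app you'd add the
  // track before creating the offer, or renegotiate after adding it.
  localStream.getTracks().forEach((track) => pc.addTrack(track, localStream));

  // Remote thumbnail: show the other side's track when it arrives.
  const theirThumb = document.getElementById('their-thumb') as HTMLVideoElement;
  pc.ontrack = (event) => {
    theirThumb.srcObject = event.streams[0];
    theirThumb.play();
  };
}
```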

The second prototype is much less demanding on the tablet than the first, and it works just as well as the first prototype did. It doesn’t turn the Venue 8 Pro into a George Foreman Tablet either, so eggs and sausages everywhere can now breathe a sigh of relief.
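One aside on the serving setup: Mongoose is just what I had lying around, and anything that can serve static files over the portable LAN will do. For instance, if Node happens to be installed on the serving device, a few lines like these would cover it; this is my own stand-in for illustration, not what I actually ran:

```typescript
// Tiny static file server as a stand-in for Mongoose.
// Assumes Node.js is available; the port and root directory are arbitrary.
import * as http from 'http';
import * as fs from 'fs';
import * as path from 'path';

const root = process.cwd();

http.createServer((req, res) => {
  // Map the request path to a file under the web app's directory.
  const urlPath = !req.url || req.url === '/' ? '/index.html' : req.url;
  const filePath = path.join(root, urlPath);

  fs.readFile(filePath, (err, data) => {
    if (err) {
      res.writeHead(404);
      res.end('Not found');
    } else {
      res.writeHead(200);
      res.end(data);
    }
  });
}).listen(8080, () => {
  console.log('Serving ' + root + ' on port 8080');
});
```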

Next Steps

The next thing I want to do, since it’s all working now, is source a couple of cheap tablets and keyboard folios, then test this setup out in the wild. I’ll take Mrs E out for supper or coffee and we’ll see if any problems occur, then address them as needed. I’ll post the results in Part 3.

Update: Part 3 is up now! You can read it here.

Deaf fish in a hearing pond, Part 1

Introduction

As a deaf person who works in a predominantly hearing environment, I have a keen interest in anything that helps me break down communication barriers and engage with hearing people on as close to an equal footing as possible. So, every once in a while, I review the status of the market for accessibility aids for deaf people to see what’s new.

These accessibility aids range from physical devices to services and software that help deaf people communicate with hearing people. The one constant I notice during each market review is that pretty much everything I come across has one or more show-stopping problems. However, along with the show-stoppers, there’s usually at least one good idea behind each of those otherwise hilariously flawed accessibility aids.

I was particularly intrigued by the UbiDuo 2, a device that lets people chat face to face in realtime.

UbiDuo 2

It’s a clamshell device that unfolds and splits into 2 separate devices that communicate over a wireless ZigBee connection. Each person types into their unit, and the other can see each keystroke happening in realtime. This makes for a much more fluid conversational experience compared to, say, paper and pen.

The problem with IM clients and conversing on paper is that there’s an inherent lag in the communication process. Using an IM application, one person types, and the other person twiddles their thumbs waiting for the typing party to finish and send. With a notepad, one person twiddles their thumbs waiting for the other party to finish scribbling and hand over the notepad. It’s just…clunky.

With something like a UbiDuo 2, there’s no wait. You’re watching each keystroke happen and seeing the other person’s thoughts being composed in realtime. It may not sound like a big difference, but it is–if you’re a hearing person, imagine how silly things would be if you could only communicate with other hearing people by dictating your message into a tape recorder and passing it to the other party, who listens to the tape and then records their response. That’s the one brilliant point in the UbiDuo’s favor.

In the flaws column, there are a number of significant issues with it that limit its practicality for me. The unit is about the width and length of a 15″ laptop and weighs 4 pounds. On top of that, there’s the $1,995.00 MSRP. I just can’t bring myself to drop that kind of dough on a unitasker that I can’t fit into my messenger bag along with the rest of my other stuff, and I’m not about to load myself down with extra bags like some kind of tourist.

That got me to thinking, and since I like challenges, I decided to see if I could roll my own functionally equivalent setup for a fraction of the size and cost, and have some fun with it in the process. I’ll be documenting the journey here as I go along.

Update: Part 2 is up now! You can read it here.

I got a new hearing aid!

We went to the audiologist the other day to get me fitted for a hearing aid (an Oticon Chili SP9), and I was blown away by how far hearing aids have come since the last time I had one in…1997 or 1998, I think it was.

My previous hearing aid was an analog aid, and it pretty much just amplified everything that was going on around me, including all the irrelevant noises that people normally tune out, and the sound quality wasn’t particularly great. I couldn’t tell music from speech, I had no speech discrimination whatsoever, and everything sounded kind of the same. It just wasn’t a great experience.

These new digital aids are something else. They’ve got bells and whistles out the wazoo and do a nice job of filtering out the background “static” and focusing on the important sounds, and instead of adjusting 3 tiny dials on the hearing aid with a screwdriver, the audiologist now programs it on a computer. It’s pretty neat.

I don’t really know how to adequately describe the magnitude of the difference between my old hearing aid and this one. The closest I can come is equating the old one to a 13″ CRT television with rabbit ear antennas and bad reception, while the new one is more like a 30″ HDTV, but all of the shows are in a foreign language. It’s still going to take a while for me to get fully acclimatized to sounds again, but I can already tell it’s better this time around.

I’m wearing a loaner unit right now. My actual hearing aid and custom earmold arrive in 2 weeks, and I pick them up a week after that.

The other thing I really like? Every single analog aid I’ve had over the years was a hideous fleshtone color that’s more at home in Dick’s Bargain Dildo Emporium (D.B.A. Poke & Save) than behind someone’s ear. My new one’s a tasteful metallic graphite grey instead!