Interactive wayfinding has advanced rapidly over the last five years, turning this now-familiar technology into an amenity offered in many populated venues, most notably hospitals, colleges, and malls.
Many of my previous posts are dedicated to wayfinding features and functionality: staff directories, mobile availability, internal message boards, event calendars, interactive scheduling, and emergency alert systems. The extended features that can be implemented are limited only by the imagination of the client.
In an age where convenience and abundant choice are key, we want to make sure that the public has every service at its disposal. But what about the minority? Those in wheelchairs or with other disabilities? The visually and hearing impaired? In earlier articles I mentioned Section 508 compliance in our apps. Each app has designated features to meet the needs of users with disabilities, whether that means directions specifically to wheelchair-accessible entrances or on-site kiosks physically altered for easier access; there are features specifically designed to ensure ease of use for those users.
Going above and beyond compliance and regulation, there are now touch-responsive maps and models that speak upon touch and interaction. The Center for Inclusive Design and Environmental Access (IDeA) at the University at Buffalo and Touch Graphics, Inc. designed, fabricated, and installed a series of touch-responsive talking models for visually impaired travelers. The interactive models were placed in three locations frequented by blind staff and visitors: the Technology Center at the Carroll Center for the Blind in Newton, Mass.; the Chicago Lighthouse for the Blind; and the Grousbeck Center at the Perkins School for the Blind in Watertown, Mass.
Each talking map presents the spatial layout of its immediate surroundings in a multi-sensory format that is usable by everyone, with a particular emphasis on the needs of the blind. All three installations rely on capacitance sensing to measure multi-finger touches on opaque, textured surfaces and shapes. The need to sense touches against irregularly shaped surfaces requires a different approach compared to flat touchscreens. In these examples, conductive paint was applied to plastic forms produced by 3D printing or CNC milling. Rooms, buildings, walking paths, roads, bus stops, or other map features that react when touched are created as individual, electrically isolated painted regions. The regions are connected by thin wires to sensors housed in the pedestal, and a computer handles all interactions and displays relevant media stored as sound clips and visual imagery. The sensors use a patented method (Landau & Eveland, 2014) of measuring finger pressure, and software permits building staff to “tune” the model, equalizing trigger-thresholds for each zone, to produce a convincing illusion of pressure sensitivity. Through user testing, the developers optimized sensing algorithms to ensure that users with different degrees of hand strength and dexterity found the systems easy and enjoyable to use.
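To make the "tuning" step above concrete, here is a minimal sketch of how per-zone trigger thresholds might be equalized. This is a hypothetical illustration, not the developers' patented sensing method: each painted zone's threshold is calibrated from readings taken while the model is untouched, so zones with different wiring and paint coverage respond uniformly to the same finger pressure.

```python
from statistics import mean, stdev

class TouchZone:
    """One electrically isolated painted region wired to a capacitance sensor.

    Hypothetical model: a touch is registered when the raw sensor reading
    rises a set number of standard deviations above the zone's idle baseline.
    """

    def __init__(self, name, margin=4.0):
        self.name = name
        self.margin = margin      # std-devs above baseline that count as a touch
        self.baseline = 0.0
        self.threshold = 0.0

    def tune(self, idle_readings):
        """Calibrate this zone's trigger threshold from untouched readings."""
        self.baseline = mean(idle_readings)
        noise = stdev(idle_readings)
        self.threshold = self.baseline + self.margin * noise

    def is_touched(self, reading):
        return reading > self.threshold


# Two zones with different idle levels and noise, tuned so both trigger consistently.
entrance = TouchZone("accessible-entrance")
entrance.tune([100.0, 101.0, 99.0, 100.5, 99.5])

bus_stop = TouchZone("bus-stop")
bus_stop.tune([250.0, 255.0, 245.0, 252.0, 248.0])

print(entrance.is_touched(120.0))  # reading well above baseline: True
print(bus_stop.is_touched(251.0))  # within this zone's idle noise: False
```

Tuning per zone rather than using one global threshold is what lets staff compensate for uneven hand strength and dexterity among users, as the developers describe.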
This new system takes the talking websites made available to the visually impaired to a whole different level. I am thoroughly impressed with the development, the design methods, and the technology involved. However, I will say that given the intensive design and development required, this is not a cheap investment and will not be a solution for organizations with smaller budgets.
If this type of solution can be reduced in price, it might finally solve the problem of evacuation plans.
Absolutely. I found it interesting, but it does seem to require a very high investment. That is much the reason one of my companies released an app enabling locations to build their own wayfinding for $49 a month. We always ran into that issue; projects of this type tend to be costly.