
1. Executive Summary
Goal of the project: Our goal is to improve the standard of living of the visually impaired community by providing them with a means of education and employment, thus helping them become self-sufficient. The World Health Organization estimates that there are approximately 314 million visually impaired people around the world, of whom 45 million are blind. This huge community is plagued with many difficulties in its day-to-day activities because a lack of education and computer literacy makes social interaction difficult. The standard of living of the community in focus depends heavily on the people who support it. Improving the standard of living of this community and making it self-sufficient and independent forms the identity of our project.

The Problem: The lack of effective education and computer accessibility techniques leads to a lack of self-confidence, making the life of a visually impaired person vulnerable and cut off from society. Education is the fundamental reason for the economic backwardness of the visually impaired community. Even for those who attend special schools, the education received is limited due to inefficient learning techniques. Educational material is not easily available: it has to be specially created using Braille printers and embossers, which makes it expensive, and the material is bulky and difficult to maintain. Computer literacy is still in question due to the lack of effective accessibility tools. The current accessibility techniques, which mainly consist of speech recognition and speech delivery, do not provide the visually impaired with a natural form of interaction with the computer. Due to the widespread use of computers, computer literacy has become essential for an individual seeking a job; even basic computer education can enable a person to get jobs such as data entry or call centre work.
Solution: Using a portable embedded computing device, the visually impaired community can find effective methods of education, day-to-day office work and interaction with society. Given the failure of current accessibility options to effectively adapt vision-impaired individuals to modern technology and user interfaces, we propose to develop a Tactile Graphical Display device and a customized operating system with a specialized graphical user interface that provides the best accessibility by combining tactile, audio and speech feedback. The device is being developed with an agile approach, with an expected release of an SDK (Software Development Kit) which developers can use to create applications with the specific interests of the blind in mind, thus increasing the functionality of the device.

Expected Result: Improve the standard of living of the visually impaired community by providing education, employment opportunities and ways of interacting with society. The device provides a framework for the development of various applications, such as educational, social networking, entertainment and professional applications, with which the user can learn as well as work effectively and thus improve their standard of living.

Conclusion: Project iRIS provides users with a natural way of interacting with a computing device and thus facilitates ease of access. With ease of access, users can learn and work better, accomplishing the goals of the project.

2. Problem Analysis

2.1 Background: In 2006, the World Health Organization estimated that there are approximately 314 million people around the world whose vision is impaired, of which 45 million are blind. The situation at a regional level is much worse: with over 15 million blind people, India has the dubious distinction of leading these statistics.

Figure 1: Blindness Prevalence statistics across the globe

2.2 Problems faced by the Blind: The life of a visually impaired person is vulnerable and cut off from society. In fact, according to the Social Security Administration, USA, among the various problems faced by the blind, the most important include maintaining independence and socializing. By interviewing academicians and social workers at various NGOs, we identified the following as the major difficulties faced by the blind:

2.2.1 Education: Visually impaired students find it difficult to integrate and learn at schools for the sighted. Most of the general learning techniques used for students with normal vision cannot be used for the blind. Such students need special learning techniques, such as tactile images for learning shapes like maps and scientific diagrams, or audio translations. Also, study material specifically for the blind is not easily available. It has to be specially created in the form of Braille-printed hand-outs using Braille printers and embossers. Not only is the material produced cumbersome, costly and difficult to store, but the machines for producing it are also costly and difficult to maintain.

2.2.2 Computer Literacy: Computers today have become an inseparable part of our lives, making a difference not just at the workplace but also in our personal lives. The majority of information today exists in digital form, making it almost inaccessible to the blind. In such a scenario, the onus is on making computing devices more and more accessible to the visually impaired. The problem with current accessibility options like screen readers and refreshable Braille displays is that they force visually impaired people to imperfectly adapt to existing GUI-based interfaces rather than giving such users a more natural way of interacting with these devices.

2.2.3 Socializing: Physical socializing is difficult for the blind, as little stress is given to socializing techniques at schools, and unlike sighted people, who learn to interact with others by watching and modelling them, the blind are unable to do so. Virtual socializing tools like instant messengers and social networks are not user friendly for the visually impaired, as they depend heavily on GUIs, making them inaccessible.

2.2.4 Employment: The problems associated with education, computer literacy and socializing limit the employability of the blind. Such people are usually employed for vocational jobs like basket weaving and sewing, which are considered menial in society, when basic computer literacy could help them perform more scholarly jobs, such as being a customer care operator, or even data entry and dictation with adequate training.

3. Project Analysis
Considering the various problems faced by the visually impaired community, the only possible solution is to provide an efficient means of education and computer literacy. Today, with the majority of educational material becoming digital and employment depending on computer literacy, it is essential that we stress computer literacy for the visually impaired community in order to achieve the goal of the project, viz. improving the standard of living of the community in focus. There are a number of tools available in the market that make the computer accessible to the visually impaired community; however, these tools have a number of drawbacks, which we discuss below.

3.1 Problems with current accessibility options: The accessibility options available today for the blind can basically be categorized into hardware and software. Specialized hardware components like Braille displays, apart from being costly and lacking portability, are insufficient to make existing GUI-dependent user interfaces accessible to the blind. Such devices are only capable of translating certain portions of the interface (usually textual information) to Braille, which makes the computer only partially accessible. Users are able to read certain contents displayed on the screen; however, navigation between applications is extremely difficult. Hardware components like Braille embossers or indenters are also used to create material specifically for the blind, but this material is bulky, cumbersome and difficult to maintain. The software components used by the visually impaired are variants of screen or desktop readers and voice command synthesizers. Such tools depend on the tool-tip information provided by application developers and on UI Automation techniques to make GUI-based interfaces more accessible to the blind. These components lack flexibility and depend extensively on speech-deciphering technologies, which do not give very high accuracy. Thus, the usefulness of such hardware and software components for accessibility is limited, and there is a need for a better accessibility technique for the blind.

3.2 Tactile Graphics: 2D graphic representations on computer screens bring many advantages, especially speed, to sighted people. Thus, there is a need for such 2D representations for the blind as well, and the answer has been the emerging use of tactile graphics. Tactile graphics are images that use raised surfaces so that a visually impaired person can feel them. They are used to convey non-textual information such as maps, paintings, graphs and diagrams. This technique is used for educating visually impaired students; however, there hasn't been much effort to translate this technique into improving interface performance for the blind.

Much study done in the field of tactile graphics has convinced us that graphical icons made up of arrays of dots can easily be recognized by touch. The key is in the development of graphical icons that can easily be identified by scanning their outlines with the fingertips. This makes it simple for the user to access applications on the desktop or functionalities within applications, and thus makes the computer more accessible.

Figure 2: Translation of graphical icons to tactile graphic icons

Shown above are two commonly encountered icons, viz. music and email. Use of these icons not only saves space on the user interface but also makes the interface more navigable. The icons shown alongside the regular icons are transformations of the regular icons into tactile graphics. These tactile-graphic-based icons can easily be reproduced on low-density pin-based tactile displays. The icons are designed so that they are easily distinguishable by the visually impaired by touch. With experience, the user becomes familiar with the interface and can quickly navigate among applications. Since current user interfaces depend heavily on graphical images, such tactile graphic equivalents can be thought of as the best possible adaptation in accessibility techniques.

3.3 Proposed Solution: Given the failure of current accessibility options to effectively adapt vision-impaired individuals to modern technology and user interfaces, we propose to develop a Tactile Graphical Display device and a customized operating system with a specialized graphical user interface that provides the best accessibility by combining tactile, audio and speech feedback. The device is being developed with an agile approach, with an expected release of an SDK (Software Development Kit) which developers can use to create applications with the specific interests of the blind in mind, thus increasing the functionality of the device.

3.4 EPA Technology and Concept EPA Cell: EPA (Electro-active Polymer Actuator) is a revolutionary new technology, often described as the backbone of modern robotics, that aims to develop artificial muscles controlled by electrical pulses. Among the various applications of EPA technology, one hot research agenda is the development of tactile surfaces.
EPA-based displays will be extremely inexpensive, costing around one tenth as much as piezo-electric modules; additionally, they are highly compact and, having no moving parts of any sort, are a natural fit for tactile displays. Research in the field of EPA-based tactile displays is promising, and a few such displays and devices are already on their way to the market. With such a technology available, the only question that remains is how we are going to make use of it, and project iRIS aims to provide an effective solution. The EPA technology is based on the electro-activity of certain chemical materials, including polyvinyl alcohol, polyacrylamide, poly-N-propyl acrylamide etc., which respond to an applied electrical impulse with a change in one of their physical properties, such as volume, viscosity or density.

Figure 3: An EPA Cell and its driving mechanism

The working of a concept tactile cell based on EPA technology, using polyvinyl alcohol (PVA) as the EPA fluid, is shown in the diagram above. The cell is a cylindrical cavity filled with PVA. The external wall of the cavity acts as the cathode. The upper face of the cylinder is covered by a thin layer of an elastic material such as rubber or plastic that can expand with an increase in the volume of the PVA fluid. A small anode is placed at the base of the cylindrical cavity. When an electric pulse is applied to the anode, an electric field of small magnitude is created between the two electrodes. In response to the field, the fluid expands in volume. The expansion forces the liquid to exert pressure on the flexible covering on the face of the cavity, which swells up, creating a bubble. The most important fact to note here is that the voltage required for actuation of a single cavity is only around 1-2V. This means that an array of such cavities requires very little total power, a major factor that increases the battery life of the device. Also, a few of the newer designs use a latching mechanism that requires an electrical pulse only while latching the fluid into a particular state, i.e. the UP or DOWN position. Once a position is achieved, the electrical supply can be withdrawn and the cavity maintains its state. This further increases the battery life and reduces the size of the battery required. The chemicals and the fabrication process for the display are very inexpensive. Additionally, the display is compact, portable and replaceable.

References:
http://news.ncsu.edu/releases/wmsdispignabraille/
http://www.freepatentsonline.com/5580251.pdf

3.5 The Device: The device consists of a tactile display and a set of input buttons. The tactile display is an array of pins which can be selectively actuated in order to create a desired graphical pattern (in this case, UI screens).
The pins are raised or lowered by independent actuators that sit below each pin. The display size is chosen so that the whole display can be more or less covered by the user's palm. A highlighted icon is indicated by a rectangular border around it. Once the user detects the icons on the screen, he places his palm over the display. In order to navigate through the interface, the user selects one of the available icons by highlighting it and pressing the Enter button. A change in highlighting from one icon to another is sensed through the raising and lowering of the rectangular highlight, making it easy for the user to identify which icon is highlighted.

Figure 4: Concept Device

Whenever the user encounters something he finds difficult to interpret, he can simply press the audio prompt key, which provides audio feedback for the selected icon, or he can make use of the tool tip displayed in Braille for the selected icon at the bottom of the interface. The user interface is discussed in more detail in the following sections. The device is also accompanied by a Braille keypad for the user to input data in Braille. The use of this Braille keypad is, however, optional: a user accustomed to a QWERTY keyboard can also use one to input text via the PS2 port present on the device. A number of navigation keys have been provided to help the user navigate through the interface. The shapes of the navigation keys are chosen to assist the blind in identifying the appropriate key. Also, navigation through the UI is always assisted with tones, which make it easier to detect what action has been performed. The device also has provision for programmable soft keys that application developers can use for specific purposes.

3.6 User Experience: As per our study, tactile graphical feedback alone is insufficient for a visually impaired user, especially one with little experience. However, tactile graphical feedback combined with audio prompting provides a near perfect solution to the accessibility issues. When touch feedback is accompanied by audio prompting, the user can confidently navigate through the applications. The confidence with which the user navigates through the interface is extremely important, as it ensures the usability of the product. As has been observed in sighted individuals, with experience users stop reading icon descriptions and tool tips, since they are trained to identify applications by their icons. Similarly, for the visually impaired, with experience touch feedback alone becomes enough to navigate the interface, reducing dependence on audio prompting and greatly improving access speed. Since the tactile icons can be learned and remembered, we intend to create a directory of standardized icons which users will encounter as part of the user interface, leading to the development of a new symbolic language for vision-impaired individuals. While designing the user interface, we have defined a few rules and created a few UI elements to make the device more ergonomic and boost accessibility. The following are a few basic concepts of the iRIS UI.

3.6.1 The GRID: 'The grid' is the basic concept of the iRIS user interface. It is an imaginary skeleton over which iRIS user interfaces are designed. Such a skeleton is required because if icons were randomly scattered over the interface, it would be very difficult for the user to identify their locations.

Figure 5: Difficulty in accessing randomly arranged icons on the interface

As seen in the figure below, the grid is an imaginary 4x6 table that divides the display into 24 cells. While designing custom applications for the host software, the UI designer makes use of this grid when positioning the various elements of the interface. According to the grid rule, each element should fit within the imaginary boundary of a single grid cell or a merger of two or more grid cells. Each cell is associated with a row and column number. Each time a UI element, for example an 'icon', is placed into a cell, the icon is identified on the grid by its row and column number. Thus any icon on the UI can be selected using the row and column navigation palette on the device by pressing the corresponding row and column keys. This saves the user from hitting the UP and DOWN arrows until the desired icon is reached.

Figure 6: The 'GRID'

Also, the two Braille characters at the bottom of the grid display the row and column number of the currently selected element. Thus, if the user loses track of the current selection, he can simply read these two numbers to reach the corresponding location. The grid allows fixed positioning of UI elements in a static manner, so the user can be confident that an element will never appear at a random position on the display, making the UI more reliable and ergonomic.

3.6.2 Navigation Palette: The accessibility of the display is enhanced by a set of hardware buttons. These buttons act as row and column selectors and together form the navigation palette. The row and column selectors correspond to the grid discussed in the section above; thus there are 6 column selectors and 4 row selectors.
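This addressing scheme is simple enough to sketch in code. The following is an illustrative model only; the function names and the 0-based numbering are ours, not part of the iRIS software:

```python
ROWS, COLS = 4, 6  # the iRIS grid: 4 rows x 6 columns = 24 cells

def select_cell(row_key: int, col_key: int) -> int:
    """Map a row-selector and column-selector press to a cell number (0-23)."""
    if not (0 <= row_key < ROWS and 0 <= col_key < COLS):
        raise ValueError("selector out of range")
    return row_key * COLS + col_key

def cell_to_row_col(cell: int):
    """Recover (row, column), as shown by the two Braille characters
    at the bottom of the grid."""
    return divmod(cell, COLS)
```

For example, pressing row selector 2 and column selector 2 activates cell 14, and the Braille indicator would read back (2, 2).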

Figure 7: Selection of icons using the navigation palette

At any point in time, if the user wants to select a particular grid element, he can press the corresponding row and column selectors and thus make the UI element present in that grid cell active. The select button on the side panel of the device can then be used to invoke the functionality provided by the icon in focus.

3.6.3 Text Area and Scrolling Pad: Textual information is always displayed in text boxes. A text box is identified by its characteristic shape, which the user can easily recognize by scanning its boundary. A text box is formed by combining two or more cells of the grid; the text box shown alongside, for example, is made up of a 1x3 grid array.

Figure 8: Concept Text Area

Text boxes can be of two types: read-only and read-write. Read-only text boxes represent text areas where text data is retrieved and displayed for the user to read but not edit. Read-write text boxes, on the other hand, are editable UI elements in which the user can not only input text but also perform simple text editing like cut, copy and paste. During some operations a text box may have to accommodate more text than its actual size, so we have developed a provision for a vertical scroll bar. By default a text area does not have a scroll; when there is extra text to be displayed, the text area shows a small scroll symbol indicating that it is scrollable. The presence of scroll and the possible scrolling directions are indicated by UP and DOWN arrow symbols at the right end of the text box. At the same time, the scrolling pad to the right of the display area becomes active and indicates the location of the cursor with respect to the entire text area contents. Once the presence of scroll is detected, the user can select the dedicated scroll bar section on the device using the scroll bar key located on top of it. The scroll bar section hosts scroll arrows and a scroll paddle that indicates the amount of scrolling possible. Once the scroll section is selected, the user may scroll up or down using the UP and DOWN navigation keys.

3.6.4 Tool Tip: During use, a user may need to know what a particular icon or element on the display is, or which part of the display is active. In such a case he may use the audio prompt button, which speaks out the name of the currently active icon. The user


can also get tactile feedback of the same through the dedicated tool tip present just below the display area and above the Braille keyboard. The tool tip is basically an accessibility aid that gives the user a sense of direction as to where he is in the display region. The advantage of having a dedicated tool tip even though an audio prompter is present is that the tool tip provides tactile feedback, which some users may prefer over audio. Just like screen readers, the audio prompter may confuse the user for reasons such as the accent of the speech and other ethnic factors. The redundancy brought about by the presence of both the audio prompter and the tool tip makes the device robustly accessible.

3.6.5 Command Box: To improve the ease of access to the functionalities provided by the device, the software supports typed-in commands that activate the programs named by the user. For example, if the user is working in a text editor and needs to open the email client to check for mail updates, he may simply invoke the command box and type in 'mail' to open the email client. The command box is activated by one of the programmable soft keys provided on the left panel of the device. Being programmable, this key can be disabled by a programmer while a particular application is running by overriding its functionality, providing flexibility and security to applications and application developers.

3.6.6 Host Application: The host application is the main application written by the device manufacturer. It is the central aspect of the software installed on the device and is analogous to an operating system, its main function being to identify and display all the custom applications installed on the device. The host application is the start screen of our device. It displays an iconic view of the manufacturer-installed custom applications, the most used applications on the device, and an icon for the other custom applications installed on the device. These other applications are segregated and displayed to the user on the basis of the classification discussed below. On selecting any of the icons, the corresponding application is launched.


Figure 9: Host Application Interface


3.6.7 SDK and Emulator: The project aims at the development of an SDK and emulator that developers can use to build and market custom applications for users. The SDK contains the basic methods involved in developing an application, which eases the developers' work. The emulator is basically a simulated form of the device which developers can use to test their applications. It provides a realistic understanding of the hardware display that users will use, and its presence helps developers by eliminating the need for the actual hardware during application development.

3.6.8 Computer Accessibility: The solution in focus can also be used to make a computer accessible. The device can be attached to a computer (desktop or laptop), making that computing device accessible through ours. When attached, the device installs the host application and the associated custom applications on the computer. After installation, the host application starts up, and the applications can access the file system of the computer. It searches for and collects the readable documents on the computer and makes them available to the device user when he launches an application which uses such files. The basic requirement is that the host computer runs the .NET Framework.


3.6.9 Custom User Applications: The device will run a number of applications with specially designed user interfaces that depend on the display and the accessibility required by the users. The applications can generally be grouped into the following categories:

3.6.9.1 Educational Applications: These applications are usually built with the specific aim of providing knowledge or learning value to the user. Such applications can either be standalone, like a customized client to access material from sites like Wikipedia using the APIs provided, or centralized. Centralized applications can be used openly in the community to upload tutorials to a database running on a server. The user can download such tutorials over the internet and then study them offline. The uploaded tutorials have a predefined format which enables teachers and academicians to create and distribute study material specifically for blind students, acting as a replacement for Braille handouts.

3.6.9.2 Social Networking Applications: These applications depend on internet facilities and help the users interact with society. Sample applications in this group include an accessibility-customized instant messenger client that uses the MSNP protocol (used by Windows Live Messenger) to chat with other users. A social networking and micro-blogging application in the form of an accessibility-customized Twitter client will be developed using the Twitter APIs. Clients for other social networks like Facebook, MySpace and Digg can also be created.

3.6.9.3 Entertainment Applications: The applications grouped under this domain include simple games like Tic Tac Toe and media applications such as a music player, built with the help of the Windows Media Player integrated in Windows Embedded Compact 7, and music composers.

3.6.9.4 Professional Applications: The applications in this group are more work oriented.
Some sample applications include an accessibility-customized email client, which uses the POP3 protocol to retrieve mail from Hotmail and Live user accounts and the SMTP protocol to send mail, an Organizer for managing day-to-day activities, a Note Taker for taking notes, etc.

4. System Architecture

4.1 Hardware Architecture

Figure 10: Hardware Architecture

The device basically consists of a tactile display, input buttons, controlling hardware and custom software. The aim of the device is to interact with a visually impaired individual by generating tactile graphical patterns on the display. METEC, a Germany-based company, manufactures tactile modules which can be arrayed together to form a larger tactile display. We communicated with METEC to obtain a full datasheet to understand the physical build and the hardware configuration of the modules. With the data in hand, we designed the supporting hardware to interface our system with the modules.
4.1.1 Tactile Graphic Display and Input Buttons: The display is an array of pins which can be selectively actuated to rise or fall in order to create a desired graphical pattern. The actuators are driven by the DCM unit. The Braille keyboard and joystick are used to navigate the user interface.

4.1.2 eBox: The Vortex-processor-based eBox is at the heart of the iRIS device, acting as its central processing unit. It runs a customized Windows Embedded Standard 7 OS image and the iRIS host application. Since the eBox is designed to be a portable, low-power, battery-efficient embedded platform, it is an ideal component for our application. When the iRIS application runs on the eBox, various UI screens are generated. These screens are rendered as tactile patterns on the tactile display. The iRIS host application generates each screen in the form of a matrix called the "Display Status Matrix", which holds the position information for the tactile actuators on the display in binary format.
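To make the Display Status Matrix concrete, the following is a minimal sketch of it as a binary matrix with one entry per actuator. The display resolution and helper names are assumptions for illustration, not figures from the actual hardware:

```python
# Illustrative model of the "Display Status Matrix":
# one binary entry per tactile actuator, 1 = pin up, 0 = pin down.

ROWS, COLS = 40, 60  # assumed display resolution, for illustration only

def blank_screen():
    """An all-pins-down screen."""
    return [[0] * COLS for _ in range(ROWS)]

def draw_highlight_border(matrix, top, left, height, width):
    """Raise the pins along a rectangle outline, like the highlight
    border drawn around the currently selected icon."""
    for c in range(left, left + width):
        matrix[top][c] = 1
        matrix[top + height - 1][c] = 1
    for r in range(top, top + height):
        matrix[r][left] = 1
        matrix[r][left + width - 1] = 1
    return matrix
```

A screen refresh would then amount to transmitting this matrix to the display controller, as described in section 4.2.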

4.1.3 Display Controller Module (DCM): The tactile display, as mentioned before, is made up of thousands of actuators that are individually controlled to create a specific tactile pattern. In order to independently and selectively control these actuators, an electronic control mechanism is needed. Since the processor board cannot provide the large pin count needed to control every single actuator, a discrete mechanism has been designed that acts as an interface between the processor platform and the actuator assembly. This mechanism is termed the DCM, viz. Display Controller Module. The DCM consists of an ATMEGA 128 based microcontroller board and a custom-designed actuator array control mechanism.

4.1.4 The ATMEGA 128 based controller board: The board is based on ATMEL's AVR-series ATMEGA 128 microcontroller. It is designed with the ATMEGA 128 as the CPU, with various ports broken out to embedded peripherals including a joystick, LCD display etc. The board can be interfaced with an external CPU in a multiprocessing environment (in our case the master CPU being the eBox processor) over RS232-based serial communication by programming the dual UART ports on the controller. Firmware can be downloaded to the board using its In-System Programming capability, and can be a simple embedded C program or a small embedded RTOS such as VxWorks.

4.1.5 The control assembly: The control assembly is designed to enable independent, selective actuation of the tactile actuators. The control mechanism uses shift registers and an array technique to utilize the available pins on the microcontroller and virtually multiply them for the actuation of thousands of actuators on the display.
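The pin-multiplication idea can be modelled in a few lines. This is a conceptual simulation of a daisy-chained shift register, not the actual DCM firmware; the class name and chain length are ours:

```python
class ShiftRegisterChain:
    """Daisy-chained shift registers: three MCU lines (data, clock, latch)
    are enough to control an arbitrary number of enable outputs."""

    def __init__(self, n_outputs: int):
        self.n = n_outputs
        self.stage = [0] * n_outputs    # internal shift stages
        self.outputs = [0] * n_outputs  # latched output pins

    def clock_in(self, bit: int):
        # one clock pulse: every stage shifts by one, the new bit
        # enters the first stage
        self.stage = [bit] + self.stage[:-1]

    def latch(self):
        # latch pulse: copy the shift stages to the output pins
        self.outputs = list(self.stage)

    def select_only(self, index: int):
        """Shift in a one-hot pattern so exactly one enable line goes high."""
        for i in range(self.n):
            self.clock_in(1 if i == self.n - 1 - index else 0)
        self.latch()
```

In the real assembly, each latched output would drive one module's enable line, so a handful of microcontroller pins can address hundreds of modules.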


4.2 Working, Data Flow and Data Processing:

4.2.1 METEC tactile graphical display module: The METEC tactile display module is the functional unit of our device's tactile display. Let us understand how the module works.

4.2.2 Physical construction and working: The METEC tactile graphic module works on the principle of piezoelectricity. The module consists of strips of piezoelectric material which, when supplied with a control voltage, undergo physical bending. The bending pushes a small part of the strip out of an opening on the face of the module.

Figure 11: METEC Tactile Graphic Display Module

When the user moves his fingers over the face of the module, this part can be felt as a raised pin. Thousands of such pins are incorporated in a tactile display by arranging the modules in an array. There are 10 such actuator pins on each module, which can be selectively and independently actuated to create a desired pattern on the display.

4.2.3 Electronic Interface: The module runs from a 5V DC power supply and draws 80uA per cell when all the pins on the module are actuated. Considering the low current and voltage requirements, the module is a perfect fit for a portable embedded device that runs on battery power.
Figure 12: Power requirements of METEC Tactile Display Module

4.2.3

4.2.4 Data Communication: Every module is an intelligent, independent functional unit of the tactile display. To actuate the desired pins on a module, we pass it a 16-bit data block that contains the status information for the 10 pins on that module.

The data block is binary: 0 corresponds to the actuator-down position and 1 to the actuator-up position. The 16-bit data block is transferred serially to the module through its MISO (Master In Slave Out) port. Once the block is received, the module reads the data and actuates the pins accordingly. Whenever a module is receiving data, its LE act pin has to be held at high potential (5 V). This pin acts as a control pin that enables or disables the module for data transfer.

4.2.5 Data flow and processing within the system:

Figure 13: Data Flow within the system

4.2.5.1 eBox: While the iRIS application runs, UI screens are generated by the applications and displayed on the tactile display in the form of tactile graphics. Every time a new screen is generated, the previous screen has to be erased from the display and the new screen written onto it. We call this event the Screen Refresh Event. The iRIS application generates these screens in the form of a matrix called the "Display Status Matrix". The matrix holds the status of each actuator for the given screen in binary form: value 0 for an actuator corresponds to the Down/lowered position (absence of a dot) and value 1 to the Up/raised position (presence of a dot) on the display.

Every tactile module needs to be fed a 16-bit data block containing the status of its actuator pins. The block has to be fed serially, to one module at a time. Since there are hundreds of tactile modules on the display, every module must be selected in turn to receive its data block before moving to the next. To achieve such selective feeding of the individual display modules, a mechanism is required that selects one module at a time while the control system transfers the data block to it. The LE act (enable) pin on the module is used for this purpose. While a module is accepting its data block from the eBox, all other modules must be deactivated for data communication. This is done by applying a low potential (0 V) to their LE act pins: only the enable pin of the receiving module is held at high potential while all others are kept low. As can be seen, however, this technique demands hundreds of ports to control hundreds of enable pins. Since the eBox cannot support such a large pin count, we devised hardware to do this job for the eBox. We call it the Display Controller Module (DCM).

4.2.5.2 Display Controller Module: The display controller module consists of an ATmega128 microcontroller board that acts as an interface between the eBox and the tactile display. The DCM is interfaced with the eBox via an RS232 serial port, with the dual UART on the microcontroller board programmed to enable serial data transfer. Whenever a screen refresh event occurs, the eBox generates a UART interrupt on the microcontroller, which triggers an interrupt service routine that begins the data transfer. During the transfer the eBox sends the Display Status Matrix to the microcontroller, which saves it in the on-board flash memory and decomposes it to form the individual 16-bit data blocks for every module on the display.
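The decomposition step can be sketched as extracting one module's block from the Display Status Matrix. A 2-column by 5-row pin patch per module and column-major pin ordering are assumptions for illustration; the real geometry comes from the module datasheet:

```c
#include <stdint.h>

enum { MOD_COLS = 2, MOD_ROWS = 5 }; /* assumed 2x5 pin patch per module */

/* Extract the 16-bit data block for the module whose top-left corner is at
 * (row0, col0) in the flattened display status matrix. Column-major pin
 * ordering is an assumption, not a datasheet fact. */
uint16_t module_block(const uint8_t *matrix, int matrix_cols,
                      int row0, int col0)
{
    uint16_t block = 0;
    int bit = 0;
    for (int c = 0; c < MOD_COLS; c++)
        for (int r = 0; r < MOD_ROWS; r++, bit++)
            if (matrix[(row0 + r) * matrix_cols + (col0 + c)])
                block |= (uint16_t)1 << bit;
    return block;
}
```

Iterating this extraction over every module position yields the stream of per-module blocks the DCM shifts out during a refresh.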
4.2.5.3 The Display Control Mechanism: To generate the large pin count needed to control the enable pins on the tactile modules, we devised a mechanism that acts as a pin multiplier. It consists of 8-bit shift registers operating in SIPO (Serial In, Parallel Out) mode. The 8-bit outputs of the shift registers are arranged in the form of a grid; at every intersection an enable output is generated, which is fed to the LE act pin of a module.

Figure 14: Display Control Mechanism

By feeding appropriate control words to the shift registers, one intersection is made active at a time, thus enabling one display module at a time. The control words are fed serially by the DCM unit via the UART 2 port. Once a module is enabled, the microcontroller feeds it the corresponding 16-bit data block via the MISO port of the SPI serial interface. This process is repeated to select every module one by one, thereby refreshing the display.

4.2.5.4 Data Flow:

Figure 15: Data Flow in the system
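The grid selection described above amounts to computing one-hot control words for the row and column shift registers. A single 8-bit register per axis is assumed in this sketch, whereas the real display cascades several registers per axis to cover hundreds of modules:

```c
#include <stdint.h>

/* Compute one-hot control words for the row and column shift registers so
 * that exactly one grid intersection (one module's LE act pin) goes active.
 * One 8-bit register per axis is assumed; the real display chains several
 * registers per axis. */
void select_module(int module_index, int grid_cols,
                   uint8_t *row_word, uint8_t *col_word)
{
    *row_word = (uint8_t)(1u << (module_index / grid_cols)); /* one-hot row */
    *col_word = (uint8_t)(1u << (module_index % grid_cols)); /* one-hot col */
}
```

Shifting these two words out on UART 2, then clocking the module's 16-bit data block out over SPI, completes one step of the refresh loop.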

4.3 Software Architecture:

Figure 16: Software Architecture

4.3.1 Customized Windows Compact 7 OS Image: The software on the device runs on an operating system that is customized and built solely to make the utmost use of the eBox provided for development. The operating system includes only the most basic components so that power consumption is minimal. It is built using the Windows Compact 7 Platform Builder and the eBox 3310A MSJK Board Support Package.

4.3.2 Tactile Display Drivers: Since the processor is interfaced to an innovative display module, a tactile display driver interfaces the display screen to the applications running on the processor. The driver maintains a two-dimensional matrix of Boolean values that indicate the actuation state of each pin on the display. To reduce the time delay caused by changing the actuation state of the pins, only those pins whose state has changed are actuated. Thus, the driver always maintains the previous state of the display and, based on this state, decides which pins need to be actuated.

4.3.3 Application Layer: The application layer contains the basic application that runs on the device, the Host application. The Host application, discussed in more detail below, is the heart of the software that runs on the device. It interfaces the custom applications to the device OS and drivers.

4.3.4 User Layer: The user layer contains the customized user applications installed on the device for the user. These include applications developed and deployed with the device by the manufacturer, as well as custom applications built by developers using the SDK that will be released for them. The classification and details of the custom user applications are discussed in more detail later.
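The diff-based refresh performed by the tactile display driver can be sketched as follows; the flat array representation of the display matrix is an assumption for illustration:

```c
#include <stddef.h>
#include <stdint.h>

/* Compare the previous and the new display status matrices, actuate only
 * the pins whose state changed, and record the new state as the previous
 * one. Returns the number of pins that had to be actuated. */
int refresh_changed_pins(uint8_t *prev, const uint8_t *next, size_t n)
{
    int changed = 0;
    for (size_t i = 0; i < n; i++) {
        if (prev[i] != next[i]) {
            /* a real driver would queue pin i for actuation here */
            prev[i] = next[i];
            changed++;
        }
    }
    return changed;
}
```

Keeping the previous matrix around trades a small amount of memory for a large reduction in actuation time on screens that change only partially.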

5. Market Analysis

5.1 Defects of conventional Braille printing
5.1.1 The category, quantity, and information capacity are limited: In India, the All India Confederation of the Blind is the sole Braille publishing unit. It publishes about 10,000 copies of books in Braille annually, covering merely about 200 fields. A Braille book contains no more than 200 pages and at most 80 thousand words.
5.1.2 The cost is high: Braille books are printed on swell paper, and a book of 200 pages costs about 1000 INR. In the field of cultural education today, only about 200 titles are published, with an average price of 2500 INR (about 50 dollars considering the price level), roughly 20 times the price of the normal books we read, since the fabrication technology is complicated.
5.1.3 The books are hard to use and transport: When a normal book is translated into Braille, the Braille book weighs roughly ten times as much as the normal one and is about four times larger. Braille books must be carefully protected against dampness and crushing; they are easily damaged and inconvenient to transport by mail.
5.1.4 The operational life span of the books is short: Since the words are printed as raised Braille dots, after repeated touching and handling during reading the words become blurred, imprecise and unreadable.

5.2 Current options in accessibility tools in the market and cost analysis: The alternatives to hard-printed Braille books available in the market are refreshable Braille devices and screen readers. These units use a single line of refreshable Braille modules to display information. The modules use piezoelectricity-based actuators, which are very costly; as a result the devices carry a very high price and cannot be afforded by the common man in developing countries. Beyond the cost, their utility is limited, since their functionality is hard-coded: they can only perform basic tasks such as reading e-books, taking notes and checking email. With only a single line available as an interface, the possibilities for developing applications are slim, rendering such devices a rich man's toy. Following is the list of major manufacturers and products in this refreshable Braille device category:

6. Cost Analysis:
An obvious question comes to mind: if refreshable tactile devices that use only a single line of Braille are so costly, how could a device that uses multiple lines in a much denser format be cost-effective? When we started developing our project we studied the cost feasibility of manufacturing such a display in mass quantity, targeting the common man in India as the customer. We surveyed many tactile-graphics products, including those of METEC, a German company that manufactures piezoelectric tactile graphic modules. Through our communication with METEC we learned that it manufactures these modules only on an experimental basis, for research groups working on tactile-graphics devices. With no mass production, and with no robust design available that could make use of such modules, the cost of a single module was also very high.

6.1 EAP Technology: We communicated with many researchers and professors in Human-Computer Interaction regarding the feasibility of manufacturing tactile displays. According to one reply, from Professor Richard Ladner of the University of Washington, EAP (electroactive polymer) technology is the key to manufacturing tactile displays in mass quantities. EAP-based actuators, as discussed previously, make use of electroactive fluids that respond to electrical pulses by swelling and shrinking in size. Since the chemicals used for manufacturing such displays, such as PVA (Polyvinyl Alcohol), are extremely cheap, the projected costs are one tenth of those of piezo-based actuators. Additionally, these displays have no moving parts and are therefore robust and smaller in form factor. Such displays are already on their way to market, waiting for a technology that can make effective use of them.

6.2 Device Cost Estimation

Material                                                       Projected cost
Tactile graphics display (EAP based)                           $200
Embedded Platform (to run Win Embedded and iRIS Application)   $100
MCU and control mechanism                                      $50
Buttons / QWERTY Keyboard (optional)                           $20
Battery                                                        $50
Covering and other build materials                             $50
Total                                                          $470

6.3 Target Market: Given the huge blind population in India, the product targets the common blind person, whether a student or a job-seeking individual. Its approximate price of INR 20,000 is less than that of a mid-range mobile phone.

With an application market for the device, an intuitive user interface and a competitive price, the device offers a great deal. Since it would play a major role in education and employment, it is not only a good product to buy but an essential one for a blind individual. With government subsidies on the product, availability of funding from banks for the disabled community and mass orders from blind schools, the price could drop even below the projected range, making the device affordable to everyone.

7. Surveys, Field Studies and Testing


To understand the requirements of the blind, we visited the Poona School and Home for the Blind, Pune, and the National Association of the Blind (NAB), Mumbai. There we met academicians and other people who actively work to improve the conditions of the blind. They initially provided us with information about the difficulties the blind face; with their help we were able to brainstorm, identify the problem at hand and work toward a solution. To gauge the feasibility of our concept, we tested the idea of tactile graphical interfaces: we created small embossed images of the interfaces on paper, which the blind students at the Poona School and Home for the Blind then traced by hand with the help of their teachers. The teachers gave us positive feedback: with a little user experience the students would be able to identify the icons quickly, and with tool tips and audio prompting the learnability of the system is much higher than with current techniques. In the future, once the device is developed, we intend to field-test our solution with the blind students at Pune and make the necessary improvements.

8. History Timeline and Project Status


Start Date    | End Date      | Duration | Task                                                                | Resources                                                      | Status

7th Jan 2011  | 11th Feb 2011 | 5 weeks  | Requirement Specification                                           |                                                                | Complete
              |               | 1 week   | Study of problems faced by the blind                                |                                                                | Complete
              |               | 1 week   | Study of existing accessibility tools                               | Accessibility s/w JAWS and accessibility techniques in Windows | Complete
              |               | 3 weeks  | Study and interviewing of academicians for the blind about requirements | Visited blind schools at Pune and NAB, Mumbai              | Complete

Software

12th Feb 2011 | 19th Feb 2011 | 1 week   | Operating System Development                                        |                                                                | Complete
              |               | 4 days   | Study of Windows Compact 7 OS Platform Builder                      |                                                                | Complete
              |               | 2 days   | Analysing requirements of the OS image                              | Windows Compact 7 Platform Builder on Visual Studio 2008       | Complete
              |               | 1 day    | Development of the Operating System image                           |                                                                | Complete

20th Feb 2011 | 27th Feb 2011 | 1 week   | Host Application Development                                        |                                                                | Complete
              |               | 2 days   | Development of the simulator image of the device                    | Visual Studio 2008                                             | Complete
              |               | 2 days   | Conceptualizing the home screen interface                           |                                                                | Complete
              |               | 2 days   | Development of the Host application                                 | Visual Studio 2008                                             | Complete

1st Mar 2011  | 30th Apr 2011 | 2 months | Client Application Development                                      | Visual Studio 2008                                             | Complete
              |               | 2 weeks  | Text Editor                                                         |                                                                | Complete
              |               | 2 weeks  | Wikipedia                                                           |                                                                | Complete
              |               | 2 weeks  | Chat Client                                                         |                                                                | NA
              |               | 2 weeks  | Email Client                                                        |                                                                | NA

Hardware

12th Feb 2011 | 19th Feb 2011 | 1 week   | ATmega128-based DCM unit built and tested                           |                                                                | Complete
20th Feb 2011 | 24th Feb 2011 | 4 days   | Firmware for the ATMEGA board written to communicate with the Atom board and OS |                                                    | Complete
25th Feb 2011 | 1st Mar 2011  | 4 days   | Integration and data communication testing between the Atom board and ATMEGA board |                                                 | Complete
20th Feb 2011 | 27th Feb 2011 | 1 week   | Design of control mechanism for array-based control of actuators and fabrication of related PCBs |                                   | Complete
1st Mar 2011  | 30th Apr 2011 | 2 months | Tactile display mechanism for the demo prototype designed; a small display is under fabrication |                                    | On going
