Braille Monitor December 2004
Accessibility to Microsoft Products
by Curtis Chong
From the President: In early September, I asked Curtis Chong and Ann Taylor to participate in a meeting with developers of technology at the Microsoft Corporation. As Federationists know, Mr. Chong serves as an assistant director of the Iowa Department for the Blind, and Ann Taylor is the manager of the International Braille and Technology Center of the National Federation of the Blind. When our representatives returned from the meeting, Mr. Chong sent me a summary of the events that occurred while he and Mrs. Taylor were in Redmond, Washington. Because we are reporting on the current state of technology for the blind, the report is particularly relevant. Here is what Mr. Chong wrote:
Des Moines, Iowa
September 13, 2004
Dr. Marc Maurer, President
National Federation of the Blind
Dear Dr. Maurer:
On September 7 and 8, 2004, Ann Taylor and I met with representatives from the Microsoft Corporation at its headquarters in Redmond, Washington, to discuss nonvisual access to the Microsoft Windows operating system and other Microsoft applications such as Office, Money, Terminal Server, and Internet Explorer. Overall the two-day meeting turned out to be a fruitful exchange of views and information between the National Federation of the Blind and Microsoft. As we have done in the past, we communicated our determination as blind people to have access to the applications we need to pursue our personal and career objectives. The meeting also reaffirmed our belief that a great deal of Microsoft's accessibility effort depends upon negotiation, persuasion, and cooperative relationships among its many and varied product groups.
The Accessibility Technology Group
The Accessibility Technology Group (dubbed ATG within Microsoft) lies at the heart of Microsoft's accessibility efforts. The head of this group is Madelyn Bryant-McIntire. Ms. Bryant-McIntire has a strong engineering background and tends to address issues in technical as opposed to political or emotional terms. Microsoft employees who are actual members of the ATG seem to have a solid grasp of the requirements for accessibility and a strong commitment to making things accessible whenever and wherever possible. However, the ATG does not have any veto power over the release of specific software at Microsoft. Rather its strength lies in its ability to evangelize the need for accessibility with various product groups and to ensure that within each group there exists an accessibility champion, who can continuously promote the cause of accessibility as work on a specific product moves forward.
During our two days of meetings, we met with accessibility champions from a number of different product groups, and it was very clear that the overall goal of full accessibility to all Microsoft products has been and continues to be difficult to achieve--with mixed results across the various groups. Accordingly it is difficult to believe that accessibility is being mandated at the highest levels of the corporation.
Madelyn Bryant-McIntire provided a briefing concerning Longhorn, the next version of the Windows operating system. About ten years ago Microsoft began work on something called Microsoft Active Accessibility (MSAA), which was supposed to provide a robust mechanism for Windows applications to communicate with assistive technology--including screen-access technology for the blind. As it turns out, MSAA has been plagued with a few nontrivial problems. It is difficult for software developers to learn and implement, it does not provide all of the information that screen access programs need, and it has been used effectively in only a few significant applications such as Internet Explorer, Macromedia Flash, and the Adobe Reader.
According to Ms. Bryant-McIntire, MSAA provides only about 20 percent of the information that screen access programs need to enable a blind computer user to use Windows applications effectively. Screen access programs grab the rest of the information they need with hooks, which at best are unsupported and at worst unstable and unreliable. This latter mechanism has been the cause of much of the instability that exists between Windows and screen access technology today.
Moreover, according to Ms. Bryant-McIntire, since the screen access technology vendors do not want to reveal proprietary trade secrets to Microsoft, Microsoft does not know how to protect the interfaces that are already working for specific screen access programs. The bottom line is that there is no compelling reason for developers to use MSAA, and screen access technology has used MSAA only for a small number of applications. What is needed is an approach which is easy to implement and irresistible to developers.
Microsoft is now promoting a new protocol variously called User Interface Automation or Test Automation. The idea here is to provide a programmatic way for software to be accessed so that such critical processes as software testing and validation can be accomplished without human intervention. As I understand it, this should allow one program to communicate with another program as if the second program were being controlled by a human--meaning full access to the keyboard, mouse, and video display. From a mainstream software development perspective, this is very desirable because it could enable lots of testing to occur automatically. Also as a natural consequence screen access technology should then be able to use this same interface to glean what information it needs. To sweeten the pot, Microsoft is proposing to use C Sharp as the implementation tool, which apparently makes the new interface comparatively easy to implement.
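We were not shown the actual User Interface Automation interfaces, but the underlying idea--a UI that exposes a tree of elements with programmatic names, roles, values, and actions, which any client (a test harness or a screen reader) can query and drive without touching pixels or the mouse--can be illustrated with a toy sketch. This is a conceptual model only, in Python rather than C Sharp; every name in it (UIElement, find, and so on) is invented for illustration and is not part of any Microsoft API.

```python
# Conceptual sketch of programmatic UI access (NOT the real UI Automation API).
# A UI exposes a tree of elements with machine-readable properties; a client
# walks the tree, reads properties, and invokes actions, instead of scraping
# the screen with low-level video hooks.

class UIElement:
    def __init__(self, name, role, value=None, children=None, on_invoke=None):
        self.name = name            # accessible name, e.g. "OK"
        self.role = role            # e.g. "window", "edit", "button"
        self.value = value          # current value, e.g. text of an edit field
        self.children = children or []
        self._on_invoke = on_invoke

    def invoke(self):
        """Perform the element's default action, as a user's click would."""
        if self._on_invoke:
            self._on_invoke(self)

def find(root, role=None, name=None):
    """Depth-first search of the element tree, as an automation client might."""
    stack = [root]
    while stack:
        el = stack.pop()
        if (role is None or el.role == role) and (name is None or el.name == name):
            return el
        stack.extend(reversed(el.children))
    return None

# A toy "Save As" dialog: one edit field and an OK button that saves the field.
saved = {}
edit = UIElement("File name", "edit", value="report.txt")
ok = UIElement("OK", "button",
               on_invoke=lambda _: saved.update(name=edit.value))
dialog = UIElement("Save As", "window", children=[edit, ok])

# A test harness -- or a screen reader -- drives the dialog without a mouse:
field = find(dialog, role="edit")
field.value = "minutes.doc"                      # "type" into the field
find(dialog, role="button", name="OK").invoke()  # "press" the OK button
print(saved["name"])                             # -> minutes.doc
```

Because the same element tree serves both automated testing and assistive technology, a product group that adopts it for testing gets the accessibility plumbing essentially for free--which is precisely the incentive Microsoft described.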
There has been a lot of uncertainty around this new concept. For one thing, screen access technology developers do not want to give up the proprietary (and very secret) approaches they have used to grab information from Windows. For another, no one knows for sure how this new concept will be implemented in Longhorn. Will Microsoft remove support for MSAA and the video hooks that screen access technology has come to rely on in favor of User Interface Automation? Or will it enable all of these components to work together?
At some point in the future it is clear that screen access software will need to be rewritten to take advantage of the new protocol, but in the meantime we were assured that when Longhorn is released, all three protocols will continue to be supported: the low-level hooks that screen access technology uses today, MSAA, and User Interface Automation. However, the writing is clearly on the wall. Microsoft intends to make a sweeping change, and at some point in the future the low-level hooks will go away.
Microsoft intends to enhance its Windows Narrator program to work with Longhorn. Narrator is a very basic speech program which is technically not a screen access program. Rather it is designed to receive information from various Windows components using either MSAA or UI Automation. Ms. Bryant-McIntire stated that Narrator is not intended to compete with existing screen access technology, which can distinguish itself by providing functions that increase user efficiency and access to important information.
When Narrator was first discussed way back in 1998, the screen access vendors were very nervous about it. However, over time Narrator has not proven to be a serious competitor in this market. As Microsoft has said, it provides very basic (and often not very desirable) speech access to some Windows functions. Under Longhorn, Narrator is intended to validate the concept behind User Interface Automation.
Internet Explorer
Aaron Solvet, who is with the Internet Explorer group, discussed accessibility efforts being conducted by his team. I think this was more of a learning session for him than for us. We explained that for the blind Internet Explorer was accessible only because its interface had been effectively rewritten by screen access software to resemble that of a word processor. We encouraged Mr. Solvet not to make any changes to Internet Explorer which would damage this interface, and we encouraged him to consider adding features to the browser which would enable the blind to have the kind of interface they needed without as much reliance on screen access software.
Microsoft Voice Command
We next heard from David Norris, who is the product unit manager for a product called Microsoft Voice Command. This software provides a "hands busy, eyes busy" spoken language interface to a number of pocket PC applications. It can be purchased from such mainstream outlets as Circuit City or CompUSA for around $35. You give it a few voice commands, and in some cases responses are spoken back to you. For example, you can say, "Tell me my next appointment," and the software will tell you that your next meeting is in twenty-seven minutes. You say, "Run calendar," and your calendar is displayed on the pocket PC's small video display (but not spoken).
Mrs. Taylor and I demonstrated to Mr. Norris that, for the blind, the software is still not fully accessible, and we urged him to continue adding more speech output functions. We also asked him if he would be willing to have Voice Command evaluated for nonvisual access by the staff of the International Braille and Technology Center for the Blind. He indicated that he was definitely interested.
Microsoft Money
Access by the blind to Microsoft Money was the first topic of discussion on the second day of our meeting with Microsoft. We met with Beth Woodman, who promotes accessibility within the Money group. Ms. Woodman indicated that she was the first person in the Money group actively to work on accessibility issues pertaining to the Microsoft Money software--both the Windows client and the Web site. Accordingly, while earlier versions of Money may have worked somewhat for the blind, Ms. Woodman reported that, according to Kelly Ford, a blind Microsoft employee who tested the latest version, accessibility had taken a step backward. Ms. Woodman indicated that securing the necessary time and resources for accessibility seems to be an uphill battle. Nevertheless, she indicated that plans are well under way to incorporate User Interface Automation into future versions of the Money product. My guess is that Microsoft continues to give priority to accessibility issues related to software that it believes to be important for employment. In this context Microsoft Money comes in second.
We also discussed with Ms. Woodman the financial services available through Microsoft on the Web. We expressed our strong desire for the Web-based application to meet all accessibility requirements so that everyone--including blind people using screen access technology--could use it. She indicated a willingness to work with the National Federation of the Blind in this regard.
Microsoft Terminal Server
We next heard from Emelda Kirby, who works in the Terminal Server group. The Microsoft Terminal Server is the platform that runs software such as Citrix Metaframe. Citrix Metaframe provides a mechanism to access a Windows machine remotely without having to run all of the application software on your local computer. The local computer runs the Citrix client and through the client communicates with a remote computer which is actually the Terminal Server platform.
So far Citrix has been inaccessible to the blind despite the best efforts of all screen access technology vendors. It was heartening to hear from Ms. Kirby that accessibility is a major priority for the Terminal Server group. Of course it helps that some government agencies, such as the U.S. Postal Service and the Social Security Administration, eager to comply with Section 508, are prepared to invest time and money to try to achieve accessibility to the Terminal Server platform and, by extension, to Citrix Metaframe.
All of the major screen access vendors (Dolphin, GW Micro, and Freedom Scientific) have plans to take advantage of the accessibility work being done by the Terminal Server group. Unfortunately this work will probably not bear fruit until the middle of 2005, and then only if everything goes according to plan. According to Ms. Kirby the Terminal Server group is committed to using User Interface Automation to accomplish its goal of accessibility.
Mrs. Taylor urged the Terminal Server group to avail itself of the expertise in nonvisual access available at the International Braille and Technology Center for the Blind. Ms. Kirby seemed positively disposed to doing this.
Microsoft Office
A discussion of accessibility to Microsoft Office (both current and future versions) was the final topic of discussion, and since Office is used by just about every blind computer user today, we spent more time on this topic than on any other. We met with four representatives from the Office group. Mrs. Taylor and I began by pointing out that even under the best of circumstances accessibility problems still exist today with various Office products. In fact, we pointed out that with respect to Office 2003, the currently available version, full support was not yet available from all of the screen access programs. We expressed frustration with the continuing cycle of software releases by Microsoft which force screen access vendors to jump through hoops to keep up.
We were told that the Office group is aware of this problem. We also learned that by the time a particular build of Office is ready to be tested for accessibility, many things are already cast in stone, making it next to impossible to fix problems encountered during the testing process. We asked if it would not be possible for the National Federation of the Blind to become involved earlier in the testing process. We were told that lawyers needed to be consulted about this question.
We had some very useful discussions about what does and does not work for a typical blind user of Microsoft Word and Outlook. We told the Office group representatives that Microsoft Access (a database program) is still a fairly significant problem for blind users. We observed that Office 2003 represented a step forward in better access to some functions through the keyboard and informed the group that Office 2003 was not yet fully certified by the screen access vendors. We urged the Office group to speed up its efforts to implement User Interface Automation and to try to come up with creative ways to involve screen access vendors and blind consumers earlier in the testing process.
Overall I think the meeting with Microsoft went as well as could be expected under the circumstances. Representatives of some of the product groups heard from real live blind consumers and may have received insights that they never had before. We, on the other hand, learned something about how accessibility is handled at Microsoft--that is, it is still not truly a corporate mandate but rather something which various groups must be persuaded to incorporate into their product development cycles.
During our various meetings we continually urged Madelyn Bryant-McIntire to visit the Jernigan Institute and to have another meeting with the president of the National Federation of the Blind. Now that the Institute is a going concern, we said that opportunities for cooperative research involving the Federation and Microsoft were highly desirable. Ms. Bryant-McIntire expressed her willingness to come to Baltimore in the near future to see the Institute and to meet with the president.
As for the future, I am afraid that things will get worse before they get better. The User Interface Automation idea is a good one, and even though it will require some major software changes on everyone's part, once we get through the painful transition process, things should be more stable and functional in the long run. Of course we should remember that, when MSAA was first conceived, we were assured that it would solve many of our accessibility problems. Since it obviously did not, why should we feel any differently about the new protocol which Microsoft is now actively promoting? Perhaps I am being overly optimistic here. Only time will tell.
Curtis Chong, President
National Federation of the Blind in Computer Science