PRESIDENT RICCOBONO: Thank you. I want to turn this next panel over to the president of our technology trainers division. She was last on this stage in 2018. Directing technological innovation begins with the way we direct ourselves and one another. In the organized blind movement, we take responsibility for directing our own future. That means being engaged in the technologies and the processes that make a difference in our lives, and making sure that we bring our authentic experiences to those. So here to lead our technology panel is Chancey Fleet. We can't hear you, Chancey.
CHANCEY FLEET: Good evening, everybody. It is my great pleasure to introduce and welcome four leaders from across the tech industry who have each made a commitment to accessibility in the blind community and beyond and who are working in their organizations each and every day to advance ease of access for everyone. First I would like to introduce and welcome up to the stage Jeff Petty who is the Windows accessibility leader at Microsoft.
JEFF PETTY: Good evening, Chancey.
CHANCEY FLEET: Next up, Eve Andersson, the senior director of accessibility at Google.
EVE ANDERSSON: Thank you.
CHANCEY FLEET: Sarah Herrlinger, senior director of global accessibility policy and initiatives at Apple. You want to do it for me? I'm so sorry.
SARAH HERRLINGER: No worries. It is a long one. Quite the doozy. I'm Sarah Herrlinger, the senior director of global accessibility policy and initiatives at Apple. It is a pleasure to be here.
CHANCEY FLEET: Welcome to Peter Korn, director of accessibility at Amazon Lab126.
PETER KORN: Thank you so much for having me.
CHANCEY FLEET: Glad you're all here. We have four questions that our panelists are going to address. Let's launch into the first one. All of our questions were crowdsourced from our members. All of the questions you hear tonight are questions that members have asked.
While we're often delighted with improved accessibility in product releases, we're also sometimes nervous. We often confront accessibility bugs that put a stop to our productivity, to the extent that many of us avoid updating our products until we hear through the grapevine that updating is safe.
So we're wondering, how has each of you evolved your development practices to ensure accessibility at launch and reduce the need for accessibility remediation after the fact? And how can the organized blind and the tech industry work together to make accessibility as nonnegotiable as something like security when it comes to releasing a product? We'll start with Peter.
PETER KORN: Thank you, Chancey. Amazon was founded on four principles, one of which is customer obsession, and our culture is codified into 16 leadership principles, the first of which is, again, customer obsession. And we are many different businesses. We're a retail website. We're a grocery store. We're cloud computing. We make movies. We publish books.
I'm going to look specifically at the way we build products in my part of the world, which is Lab126, the devices like Echo and Fire tablet and Fire TV. These are all waterfall-style development processes. They follow a chain. You can't do the next thing until you finish the current one. And we always start with what we call a PR/FAQ: a press release that frames the product in customer benefit terms, along with a set of frequently asked questions about that product. So the first thing we did to evolve our development processes for accessibility is we added a required question to the list of questions.
How will this product be made accessible and is there anything about this product that might be especially valuable to customers with disabilities?
Next we define all of the requirements for the product. That document can run over 100 pages. We have evolved to have a set of required accessibility questions that must be answered before the product can move forward.
And as the product moves through its various development gates, we add checks: is the industrial design tactilely discernible, the buttons on the product? Is the contrast high enough, even for the lettering on the hardware? Do we have an accessibility test plan? Does the product team understand what an accessibility blocker bug is and what a severe accessibility bug is, and are we making sure that we're treating blocker and severe accessibility bugs the same way we would treat that kind of bug if it weren't related to accessibility?
And so now to the question of making it nonnegotiable. I think there are two key ingredients. First, product teams need to have a broad understanding of what it means to be accessible. They need to treat these bugs with the same urgency as if they affected customers without disabilities. And teams need to understand what a delightful, accessible experience looks like and feels like. So you need people with disabilities in all of the product development roles to help realize that and train the team so that they understand.
When we launched the voice shopping accessibility team, one of our newer dedicated accessibility teams, the vice president of voice shopping, who came from Kindle and had worked with the NFB and President Riccobono on Kindle accessibility, said, "We wouldn't launch a product for enjoying Japanese comic books if we didn't have people who loved Japanese comic books on the team, helping define and drive the product." So if you want to start a voice shopping team, you need to get people with disabilities, with the lived experience, to be part of that team. I think those are the key ingredients to doing this successfully at scale.
CHANCEY FLEET: Thank you. Let's next hear from Jeff.
JEFF PETTY: Thanks, Chancey and Peter. I'm going to take the second part first. I'll echo some of Peter's sentiments. I'll just say first, as industry leaders, I think we can help everyone understand the importance of accessibility by talking about it publicly, including how we are here together today.
Microsoft first announced our commitment to inclusion, transparency and accountability about five and a half years ago and at the time, I recall that this was a powerful message to me as an employee. That Microsoft valued inclusion so much that we decided to be more transparent and ask our customers to hold us accountable. It was a critical signal to me and other employees, to our partners and to the world that accessibility is important.
Just this May, we announced new ambitions to achieve more by empowering one of the largest untapped talent pools in the world, people with disabilities. And to realize that, we're making focused investments in technology, in the workforce, and in the workplace. Accessible technology, including more accessible Windows and Office experiences, has the power to help tackle the disability divide and to contribute to more education and employment opportunities for people across the world.
And while accessible technology is important, it is not enough. We need to invest in a workforce that better represents people with disabilities. We need to invest in increasing skills and education and in connecting workers with jobs.
And finally, we need a welcoming and inclusive workplace for people with disabilities. This needs to include more effective work to attract people with disabilities, provide accessible digital and physical work environments, build an accessible supply chain, and help partners with their accessibility journeys. Importantly, these ambitions are bigger than one company. It is going to take all of us working together across industry, the folks that are on the panel that I'm so excited to work with, and of course, with the NFB.
And I think one of the things we can do is encourage other companies to make their commitments public and to help them achieve them.
In terms of specifics, I'll offer up an example of Windows 11. It is in development now, and I think it is becoming a good example of the accessible technology development practices that are required to realize our ambitions. The Windows 11 team considered accessibility from the start. People with disabilities, including people who are blind or have low vision, are building Windows 11. Development includes trusted conformance testing and assistive technology reviews. Early Insider versions of Windows 11 are available now to screen reader and Braille device manufacturers and other partners and users for testing and to gather feedback. These processes, this open approach, and early feedback before Windows 11 becomes generally available later this year are helping us to ensure that Windows 11 accessibility is great at launch.
And finally, once it is generally available, the Disability Answer Desk is there to provide help when needed and track issues through to resolution. There will be challenges. We're not going to get it all right. We're doing our best, and we're providing help with the Disability Answer Desk to understand what's most important and to make product improvements.
So that's a little bit of high level framing and details. For me, I'll just say it starts with being clear about our commitments and aspirations and I think the rest follows from there.
CHANCEY FLEET: Fabulous. Eve?
EVE ANDERSSON: I think a lot of what we do at Google is quite similar to what Peter and Jeff just talked about from Amazon and Microsoft. I don't want to bore anybody by repeating too much. I think I would maybe add that, logistically, one of the most important things, speaking of evolution as you did, Chancey, is we've evolved a hub and spoke model at Google, where we have a hub team of subject matter experts who create standards and tools and all kinds of things like that. But then we hire people to work on accessibility in product teams around the world, across all different teams.
By design, we have more people working outside of the hub than inside of the hub. Because at least for us, it was really important to have people who know the product well, who have influence within their product areas to be doing a lot of the actual work on their product with support from our team.
So maybe that is the one thing that Jeff and Peter didn't talk about. I would also echo that partnership has been so important, both with NFB for learning what are some of the most important things to focus on, as well as with our industry partners. Jeff mentioned the Disability Answer Desk, and Google created its own because of our partnership with Microsoft. We couldn't have done it as quickly or as well as we have without that partnership. That's a really key part of making sure that we're building the right things and meeting people's needs. I'll end it there.
CHANCEY FLEET: Thank you. Last but not least, Sarah.
SARAH HERRLINGER: Following on a little bit from what Eve just said, a lot of what our companies do is similar. We're all working in a lot of the same ways, trying to embed people in as many places as we possibly can and to provide education to engineers across our entire companies so they understand more about accessibility, to get people to really understand the lived experiences of the communities that we support so that they can have the knowledge of what works and what doesn't. And a lot of that is by employing individuals from the communities in order to make sure that their voice is heard in everything that we do. There is a lot we're trying to do, and we're working together.
One other thing I wanted to touch on, that Jeff touched on a little bit when he was talking about Windows 11: for us as well, we're in the middle of the beta time frame for our operating systems that will be coming out in the fall. We launched them first in early June, and now our public betas are out. And one of the things that's incredibly important is getting feedback from people.
One of the big elements around accessibility in general is that it is about customization. When you look at all of the many accessibility features that are built into something like iOS in particular, there is a massive number of them, and the ability to use them in combination means that everybody's use of our technology is unique. Regardless of how many people we have working on a product, there's always someone who can find a bug, because they're going to use it differently than any of the people we might possibly have testing it. So we always encourage everyone to take advantage of those beta programs and actually use the tools, pound on them, and come back and tell us what works and what doesn't.
Obviously out in the developer world right now, we have I don't know how many developers who are doing this for every other feature. So we encourage the blind community to download the betas so when you do get the final product, it is a better product for your personal needs.
CHANCEY FLEET: Absolutely. Thank you so much for that. So we're going to move on to our next question, and in a minute, we'll hear from Sarah again. We all know the efficient use of Braille as a reading and writing medium is critical to Braille readers, and it is an absolute necessity for those of us who can't use text to speech.
But many Braille users are frustrated and impeded by a lack of Braille support during some product setups and installations, and by missing or inefficient commands, dropped Braille characters, and other behavior that might pop up when they're reading and writing. In an increasingly digital world, these barriers can be discouraging.
What are your companies doing to prioritize Braille support? If you feel there are systemic issues with Braille that are outside your companies' control, what do we need to know as advocates and as an industry to make using Braille with devices a first class experience for everyone? And Sarah, I know you've been doing a lot of work with digital Braille, so I'm going to lead with you on this one.
SARAH HERRLINGER: As we have put VoiceOver into more of our products, Braille is something we've tried to support along the way. It is not just audible readout but being able to use over 80 models of Braille displays on iOS, 100 models on macOS, and about 80 on tvOS, and so on across the board.
And one of the things worth noting with that is when you think about some of those Braille displays, we have Braille displays we're supporting that are almost 20 years old, for which the companies aren't even in business anymore. So, you know, maintaining functionality actually can be quite complex. And so we have full-time engineering and QA devoted strictly to Braille, just because of all of the many options we have and the many ways that we've tried to implement Braille across all of our different products.
One of the things I think is incredibly important for people to do is to support the Braille HIDP that was developed a few years ago. For anyone not familiar, we spearheaded a protocol --
CHANCEY FLEET: I'm going to be the acronym police.
SARAH HERRLINGER: Human interface device protocol, HIDP, that basically says if you follow this standard, whether you're a Braille display manufacturer or computer or phone manufacturer or screen reader or whatever it might be, we all agree this is the format, we're all going to use it, and that will make it so that this Braille display will work regardless of whether it's on an Apple product or Microsoft product or Google product, this is the way it's going to work.
And we started this process in 2017 at CSUN, at a meeting that went on for hours in a conference room with everybody from all of us to Braille display makers, to screen reader makers, you name it, all sitting there figuring out every possible thing that needed to be covered, and things that we just postulated might come in the future, to try and come up with this HIDP. So for Apple, we wanted Braille manufacturers to have something to test against before they made their displays, and now we're seeing more and more of them come out. But I think it's important to encourage more people to embrace this HIDP and use it, because that will just make it easy for everybody, and at the end of the day, it makes it easiest for the community itself to just be able to know that their Braille display is going to work with whatever they want it to work with.
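To make the interoperability benefit concrete, here is a minimal sketch, not any of these companies' actual code, of how a host might discover standards-compliant displays. It assumes the hidapi Python bindings (the `hid` package) and relies on the Braille Display usage page (0x41) from the published USB HID usage tables; how completely a given operating system reports usage pages varies.

```python
# Minimal sketch: enumerate connected HID devices and pick out braille
# displays by their standard usage page, with no vendor-specific driver.
# Assumes the `hid` (hidapi) Python bindings; 0x41 is the Braille Display
# usage page in the published USB HID usage tables.
import hid

BRAILLE_DISPLAY_USAGE_PAGE = 0x41

def find_braille_displays():
    """Return device info for displays that implement the HID Braille standard."""
    return [
        info for info in hid.enumerate()
        if info.get("usage_page") == BRAILLE_DISPLAY_USAGE_PAGE
    ]

if __name__ == "__main__":
    for d in find_braille_displays():
        print(f"{d['manufacturer_string']} {d['product_string']} "
              f"(VID 0x{d['vendor_id']:04x}, PID 0x{d['product_id']:04x})")
```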
CHANCEY FLEET: All right.
Next up let's hear from Eve.
EVE ANDERSSON: We also at Google have people devoted full time to Braille. It's a complicated area. We have so many different platforms in our companies and different types of software. I hope that all of you who use Google products and Android have been seeing the progress over the past few years. I think we started out from a position of weakness, but now you don't need a separate device to use Braille, which can be more efficient for somebody who wants to quickly type something, and then we're just improving how Braille works in Google Docs, for example, to jump to comments, and other commands that people use regularly.
I agree that we really want more manufacturers to adopt HIDP. There are so many different protocols, and that's giving our engineers a hard time.
Another thing giving them a hard time is translation tables. You know, there are just a lot of different characters and ways of doing things, and some of these open source tables are buggy. So that's another area that's just been really hard, because we do have a lot of people who use Braille in their daily lives at work, but not people who speak every single language. So I think as an industry, it would be helpful to do better there.
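For readers unfamiliar with translation tables, here is a minimal sketch using the Python bindings that ship with the open source liblouis library, which Peter mentions again below; the specific table name is an assumption about which tables your liblouis installation bundles. Bugs in a table tend to show up as bad contractions or round-trip mismatches of exactly the kind described here.

```python
# Minimal sketch: table-driven braille translation with liblouis's Python
# bindings. "en-ueb-g2.ctb" (Unified English Braille, grade 2) is assumed
# to be among the tables bundled with your liblouis installation.
import louis

TABLE = ["en-ueb-g2.ctb"]

text = "Accessibility is a team sport."
braille = louis.translateString(TABLE, text)   # forward translation
print(braille)

# Back-translation with the same table; a buggy table often reveals itself
# when the round trip does not match the original text.
print(louis.backTranslateString(TABLE, braille))
```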
And maybe if Braille display makers could standardize some of the buttons a little bit more, that would also be helpful for the people who develop software. So I guess that's my main wish list there.
CHANCEY FLEET: It would also be helpful for trainers and users for sure.
So I want to take a mild risk here and ask a question that we didn't script. And if it's not top of mind for you, the fault is mine and not y'all's. Do you have any sense of what proportion of Braille display manufacturers are supporting HIDP right now?
SARAH HERRLINGER: I don't know. It's not a ton as of yet. We're starting to see a handful of them that are on the market, and I would say more that are having conversations and starting to ask about it. But I think some of the struggle is that there are probably people who made displays in the interim between 2018, when this was put out to the world, and now, and some of those devices we can't support. I think more and more are starting to realize they need to adopt this thing quickly, hopefully soon.
CHANCEY FLEET: That sounds like a great place to advocate for Braille display manufacturers.
Let's hear from Peter about Braille.
PETER KORN: Braille is near and dear to our heart. We believe it's critical for the screen readers on the devices we carry to not only be compatible with digital Braille, but take the fullest advantage of it. So our philosophy is to maximize the precious and expensive cells on your Fire tablet through the VoiceView screen reader: rendering as much as possible on a single line, using dots 7 or 8 to indicate which of multiple words has focus, and, similarly, we developed Braille blips to indicate roles. And I can take zero credit for this. This came from our senior software engineer, a blind gentleman, Mark McKahey, who developed VoiceView, and another blind engineer who developed the blips. And they had a profound influence on our development team. One of our sighted engineers, Bryce Thobold, tasked with implementing Braille under Mark's direction, taught himself to read Braille by touch and read on his train commutes from San Francisco.
As far as the things outside of our control and some of the key issues, I think the fundamental issue that I see is the cost of the devices. We're very excited about efforts to deliver substantially lower cost displays. VoiceView was among the first screen readers to support the Orbit Reader 20, and we're working with manufacturers to add support for their forthcoming low cost displays. And the other thing that we did, Eve mentioned this earlier about open source Braille tables: we worked with John Gardner to update the open source license of liblouis to one that companies could adopt, so that we could share improvements with each other in those translation tables and in those libraries for the legacy systems. And I think that's one of the areas where these companies can collaborate in open source. We did some work with Facebook on React Native improvements, which is helping in other areas.
So those are examples of places where I think we can and do work together to improve these things.
CHANCEY FLEET: Awesome.
Last but not least. Jeff.
JEFF PETTY: I'll keep it brief. I just want to recognize, I'll double down, frankly, on Sarah's and Eve's comments, and Peter's. You know, standardizing the way that Braille displays communicate with operating systems like Windows means that we can manage different screen readers' handoff of Braille displays with higher quality, because we have these standards. It enables things like plug and play support. It enables things like independent setup with Braille. So there are so many benefits to users that are unlocked, in addition to benefits for manufacturers and, you know, folks who own operating systems, because it makes it easier for all of us to bring new Braille experiences to market by standardizing the way that we handle these devices. So I can't say enough about that, and I can't say enough about the partnership with Apple and Google. I recall the meeting at CSUN well, and it's exciting the progress that we've made since then.
And again, I'll just double down on Peter's comment. We've benefited from the work that you've done, Peter, having a less restrictive license with BRLTTY and liblouis so we can bring those open source libraries into Windows. I will just close by saying, of course, we've supported Braille with third party screen readers on Windows for years. We just brought Braille to Narrator recently in Windows 10, and we are continuing to prioritize Braille in Windows 11 and going forward. And frankly, we wouldn't be able to make the progress that we're making without partners like Apple, Google, and Amazon.
CHANCEY FLEET: Great. Thank you.
All right. We've got two more questions.
SARAH HERRLINGER: Real quick, one thing, just because it's great to be able to get info on the fly: it sounds like the APH Mantis, the Chameleon, and the HumanWare Brailliant, and possibly one from Innovision, are all using the HIDP right now. You probably want to check, but I'm hearing those are all ones that work.
CHANCEY FLEET: All right. We've got some Googling to do.
How can our movement and the tech industry work together to ensure that automation for accessibility is used in measured and ethical ways? So there's a tension between the promise of automating some aspects of accessibility using machine learning, computer vision, and artificial intelligence, and, on the other hand, the reality that wherever the future takes us, right now automated accessibility sometimes falls short, falsifies information, or outright fails. So I'm wondering, how does each of your companies consider user choice in whether to use automation, and the risks posed by automated misinformation and bias, when you're designing AI powered accessibility features? For this one, we're going to lead with Jeff.
JEFF PETTY: Thank you, Chancey.
I'll kind of zoom out and then zoom in. I'll respond about AI in general, then with respect to accessibility, and in particular some of the efforts we're undertaking to improve the experiences that we're delivering and address some of the concerns that you raise.
So first and foremost, at Microsoft we believe in the potential of AI to improve our lives in big and small ways, and we need to make sure it benefits everyone. For the first time we are taking machines and asking them to perform roles that humans have typically performed. And we recognize that when we do that, we can have unintended effects on people in society. So we have developed responsible AI which is a principle-based approach to develop and deploy technology to empower everyone, and it includes six key principles: Reliability and safety; fairness; privacy and security; inclusiveness; transparency; and accountability.
And I'll talk about the last three in particular. AI systems should empower everyone and engage people. AI systems should be understandable. And people should be accountable for their AI systems.
So I think some of these principles speak directly to some of the challenges that you've raised. Microsoft has been an early leader in AI with experiences like Seeing AI, and we were quick to recognize both opportunities and challenges, like reducing bias by training AI with better, more representative data. And what do I mean by bias? I mean when you get an inaccurate result, or a result that's skewed by the data used to train the AI model.
We launched an AI for accessibility program to influence the future of technology to ensure global independence and inclusion in society in four areas of focus: Human, community, education, and employment. And as a part of that AI for accessibility program, we're working to raise awareness and address challenges associated with disability bias in AI.
So let me be clear. This is one of the places where we need to create data by and with people with disabilities to make sure that the data used to train the models results in accurate and safe experiences. And we're working with a bunch of organizations to develop more inclusive data sets. There are lots of examples. We're working with City, University of London on what's called the ORBIT data set. This AI for Accessibility project is enlisting the help of people who are blind or have low vision to build a new data set with enhanced computer vision and object recognition to better identify specific personal items like a set of keys or a face mask or a friend's front door. And this is an example of how we're investing to specifically improve experiences like Seeing AI, so that you can rely on it in more scenarios, not just as optional support, and increase your reliance on it safely with better inputs.
I've got a bunch of other examples, but I will pause just to make sure that other folks have time to contribute. I just, again, in summary, we've got a principle-based approach, we have an AI for accessibility specific program to implement those principles, and then we're doing deep work with partners in order to deliver real impact.
CHANCEY FLEET: Thank you.
So next up, let's hear from Sarah.
SARAH HERRLINGER: Yeah. You know, I think it's kind of interesting with this question in particular, around the element of sometimes it fails or falls short, because it should also be noted that this is an issue that goes on outside of just the blind community, in sort of everything with machine learning. When you look at even some of the circumstances going on with self-driving cars, and the stories that come out of a person just letting it run and it goes awry, certainly machine learning has a ways to go for everyone.
With that being said, machine learning is not perfect. It's the same as if you ask a human, for example, what someone else next to them is eating; they may get that right or they may get it wrong. It's not like everybody, even in other parts of the world, gets everything perfectly. So the model may guess, and we need to continue to build and figure out how to get higher confidence in the work that's being done.
I think one of the big things is that responsible AI is a huge topic for us at Apple, and we're focusing on it in a lot of different ways. It's everything from the amount of choice we give users right from the start, to say, you can use this or not use this; we're not going to force it upon you. And if you choose to use it, we will be very up front that it may not do everything right. So there are times when we're saying, don't count on this for everything out there. It is possible it is going to give you a false positive.
And I think that's incredibly important to be up front with people and let them know, this is not suddenly solving all problems in the world. It's solving some. And it continues to get better because one of the things with machine learning is we continue to change the models and make them better and improve, but once again, nothing is perfect.
But with that being said, a lot of what gets us to better models is inclusion: getting more people into the rooms to talk about what machine learning should be all about, and making sure that when we're doing this, we're doing it in incredibly thoughtful and ethical ways that get as many different voices as we can in there. So when I look at something like Face ID for Apple: when we started working on Face ID, one of the things that was incredibly important to us was to make sure from the start, as we were looking at every possible facial element to do Face ID, that we had atypical facial structures, that we had prosthetics, that we had all these different things included in that model, so that it wasn't just one specific type of face that we were able to recognize. And genders, races, skin colors, all of it.
But a big piece of that was looking at accessibility as a part of this process.
CHANCEY FLEET: Awesome. Thank you. Eve?
EVE ANDERSSON: I have to agree a lot with what Sarah and Jeff just said about the importance of making sure that the AI development is inclusive and also taking a principle-based approach. Those are two of the tactics we've been employing. And we do have to be honest about when it might not be right, as Sarah said.
So for example, we use AI in Chrome as an optional setting that people can turn on to identify images that a developer hasn't labeled. But we make it clear that this is a guess. The system says "appears to be a woman eating pizza" rather than presenting it as if it were a human-written description. Best practice, of course, is for the developer to label the image themselves.
So that's what we do.
In contrast, what I would say is, some of these overlay companies make claims that a website can be made accessible with an overlay, and that is absolutely far from the truth. I don't know if the world will ever get there. That's my advice: be humble and admit when something is a guess.
CHANCEY FLEET: Wonderful. Thank you.
And lastly, Peter.
PETER KORN: Echoing everything that was said, a nice example is our Show and Tell feature. Show and Tell, if you're not familiar with it: you go to an Echo Show device, an Echo with a camera, hold something up, and ask, Alexa, what am I holding.
And we run multiple recognizers simultaneously. And I've just invoked her, of course. (Alexa speaking in the background ).
We run a recognizer against the product database that we have, because anytime we sell something in the store, we take pictures of it from all angles. We run a logo recognizer, and we run a text recognizer. And depending upon how good the image match or logo match comes back, we will say, this appears to be such and such, the text I can see is so and so. Again, to be clear and honest and up front about the quality and caliber of the machine learning results. And that user interface was, again, developed by Dr. Joshua Miele, as was the on-device computer vision that helps the customer keep the item in view of the camera.
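As a rough illustration of that pattern, and not Amazon's actual implementation, a hedged multi-recognizer response might look something like the sketch below; the recognizer functions, labels, and confidence thresholds are hypothetical.

```python
# Hypothetical sketch: run several recognizers (product catalog, logo, text)
# over one image and phrase the spoken answer according to confidence,
# so a weak match is presented as a guess rather than a fact.
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Recognition:
    label: str          # e.g. "a box of oat cereal"
    confidence: float   # 0.0 to 1.0

Recognizer = Callable[[bytes], Optional[Recognition]]

def describe(image: bytes, recognizers: List[Recognizer]) -> str:
    phrases = []
    for recognize in recognizers:
        result = recognize(image)
        if result is None:
            continue
        if result.confidence >= 0.9:
            phrases.append(f"This is {result.label}.")
        elif result.confidence >= 0.5:
            # Be up front that the model is guessing.
            phrases.append(f"This appears to be {result.label}.")
    return " ".join(phrases) or "Sorry, I'm not sure. Try holding it closer to the camera."
```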
Beyond Show and Tell, we've started experimenting with automatic generation of alt text for the retail website. We got great volunteers from the NFB last year who participated in that research. The work is early, but it's very promising: being able to describe the length of a dress, the shape of a hemline, the shape of a shoe, and of course colors.
And again, taking care to make sure that it offers substantial improvement to the online shopping experience before putting it into play while at the same time pushing the folks listing in our stores to properly put in good alt text so that we don't have to fall back on machine learning.
CHANCEY FLEET: Thank you.
And we are on to our last question. So we all know that blind people at this moment are underrepresented in the technology industry, especially outside the accessibility field. So what does each of you think: what steps can leaders in our movement and in the industry take to ensure that blind talent gets cultivated, hired, and retained? And for this one, I'll start with Eve.
EVE ANDERSSON: Thanks, Chancey. This is such an important question. And I like that you put in the part about outside of the accessibility field, because I think people make assumptions about what department people want to work in because of a disability, and that shouldn't be the case.
And sometimes I find that candidates are even self-limiting. They feel that they should work in accessibility. And I hope that nobody here at this convention feels that way, but if you do, please just know that any department, any role is open to you.
So I'll talk about some of the things that Google is doing in this area that I think would be considered some of the best practices. One of them, of course, is making sure that candidates have a good experience. I know that Google has a reputation for having engineering candidates write code on whiteboards, which probably sounds a little bit daunting to some people, but we have an entire candidate accommodations team that makes sure that people can use whatever technology works best for them. We've had people use all kinds of technologies to write things and to interact with interviewers. And we've given people extra time and support staff, and whatever people need. We don't want the interview process to ever be a barrier for any applicants.
Once employees join, it is important for them to feel comfortable. Part of that is technology: getting the technology they need set up, making sure they know about our employee resource groups and accommodations and the other resources available to people.
We have a mentoring program to help new and experienced people in the company with disabilities. It's a special program where they can be matched up with a more senior mentor. I'm a mentor myself.
Of course there are all of the accommodations that companies offer, visual care assistance, technology. To do their jobs, our employees are of course using the same technology we offer to the public, like Google Docs and Gmail and all of these things, but we also build a lot of our own internal software systems that are used. So we apply the same standards to making sure that those are accessible as well.
We have a web page in our Google career site if you want to learn more. I won't read out a long URL because the best way to find things is always by searching. So if you search for Google careers disability on your favorite search engine, it will pop up.
CHANCEY FLEET: All right. Thank you so much.
Peter?
PETER KORN: You know, I've been at Amazon... our continuous accessibility journey is not quite eight years old now. We are working to make up for the lost time. I feel I'm among giants with decades-long programs at places like Apple and Microsoft.
And one of the most critical things we did at the start of that journey was create the Amazon PWD affinity group, Amazon persons with disabilities, who self-advocate and who drive, through a voice-of-the-customer program, the People Accessibility organization, another of our multiple dedicated accessibility teams. This one, in HR, is defining accessibility as an experience that is equally efficient, effective, and delightful as anyone else's, driving employee and candidate experiences, training teams, developing training for managers, and running a third party accessibility program to drive accessibility in the kinds of third party products we use, whether it's Salesforce or other ERP products, and holding the trainings we have outside companies make to a high standard of accessibility.
As far as advice and guidance to others, you know, every organization is going to have its own path to and structure for embedding accessibility in the company. But I think there are sort of four key principles that we've found. Work backwards from the customer: where are the biggest pain points? How will you make the experience equitable and not just simply technically accessible? Work with your affinity group of people with disabilities to define that.
Understand the product mix of the experience. Are you mostly developing internal tools? Are you mostly purchasing externally? Because you're going to have different mechanisms depending upon which you're doing.
Consider the end to end experience. Joining, working, even leaving your organization. All of that should be accessible.
And finally, establish goals, launch reviews, and invest in a dedicated team to drive this. These are models that we've seen be successful for customer-facing initiatives as well.
CHANCEY FLEET: Thank you.
Sarah?
SARAH HERRLINGER: Yeah. I feel like this panel keeps turning into "yeah, what they said," because we all echo a lot of what each other says -- you know, we're all doing a lot of the same things, and I think that is because we're all trying really hard to be leaders in these areas and do all that we can.
I think from an Apple perspective, once again, I would echo some of what has been said before. It starts with recruiting, making sure that people know that any job is available to them within the company, and making sure that that is true. That in the recruiting process people are treated with dignity and respect. And that there is training and support that goes on throughout their career within the family of Apple.
And we have our own diversity network association for accessibility. We have a lot of ways that people can engage in this area. But I think one of the other things that's important to do, when you're looking at whether areas are accessible, whether jobs are accessible, is to really dig in within organizations and figure that out. As an example, one of the things we did a while ago now -- I've been at Apple for 18 years, so some things feel like yesterday but they were 15 years ago -- we were looking at trying to bring in more members of the blind community as part of our customer service program. And so we actually took a couple of years to go forensically through every single touch point that somebody in a call center would have in order to be able to do their job and effectively work with customers. And every time we got to a point where we thought we had it all, we would find another point. We never wanted someone left on the phone with a customer saying, I can't do this because it's not accessible to me. So it's important to take that seriously and really make sure that if you're going to hire someone for a job, their job can be done.
And in doing that, we went beyond the programs and systems into the facilities, down to Brailling all of the vending machines, and everything that we could do to make sure that this was an experience of equal ability within the call center.
So I think organizations really have to look strongly at themselves and make sure, when they're doing this, that they don't think, oh, great, let's just hire someone, and then not do the due diligence around whether that person is going to be able to do their job effectively.
And I think one of the other things, which Peter touched on a little bit, is third party software. We are all building our own software, both for customer facing and internal work, but there are also a lot of third party software elements that we use as huge companies. And I know for all of us, we're all working with our partners to say, please make sure that your stuff is accessible. But when you talk about how the blind community can help drive this, I think it's also making sure that some of these major companies that are making finance software and things like that are making those accessible as well.
And the last thing, I think, is making sure, as leaders in the movement, that the community is prepared for interviews and things. For anyone. When I go talk to colleges, I say, make sure you're doing mock interviews. So this isn't something I'm saying just for the blind community. I think it's important for everyone to do mock interviews, to be prepared, and to go in and put their best foot forward, so that anybody who sees them sees them for the great talent that they are.
CHANCEY FLEET: Absolutely.
Jeff, take us to the finish line.
JEFF PETTY: Again, I'll just reiterate, we are doing so many similar things.
I will say, you know, I kind of said up front our ambition is to help close the disability divide, and we think it's going to take bold investments in accessible technology, in the workforce, in skilling, and in the workplace. And though we've had a longstanding program, we're humble, we approach it with a growth mindset, and just as recently as these past couple of years, I can think about friends and colleagues who were advocating to make sure that their guide dogs were going to be welcome in transportation across campus. I have a friend who helped create a bereavement policy for guide dogs because we recognized the special importance they have for their users.
I would just say that our journey, though it's been ongoing, it continues and we're continuing to learn. You have to be deliberate as Eve and Peter and Sarah have all said. You just have to be. You have to be committed and deliberate in how you go about it to effect change at scale.
Yeah.
CHANCEY FLEET: All right. It has been such a privilege to work with you folks. You have an approach to your work in accessibility that is sincere, rigorous, and ambitious. And you all think about accessibility in a way that transcends each of your individual companies. And I just want to thank you again for taking the time to spend with us.
Thank you for being here.
EVE ANDERSSON: Thank you for having us.
PETER KORN: Thank you so much.
Thank you, President Riccobono, for the invitation.
MARK RICCOBONO: Yeah, thank you to Chancey and to all of our panelists for your participation this evening.