Melissa Green: Good evening, allies, and happy Global Accessibility Awareness Day. Today marks the 12th annual Global Accessibility Awareness Day, or GAAD. This event, held every year on the third Thursday in May, aims to get people thinking, talking, and doing when it comes to digital inclusion and accessibility for people with disabilities.
We're so glad that you decided to celebrate with us by joining today's session. If you'd like to learn more about Global Accessibility Awareness Day, I encourage you to check out their website. The URL is https://accessibility.day. That's accessibility dot D-A-Y, as in awareness day.
I'm Melissa Green, your host for this evening. I'm a member of Knowbility's Accessibility Services team, which provides accessibility audits and testing. I'm a Certified Professional in Accessibility Core Competencies, or CPACC; that's a professional certification awarded by the International Association of Accessibility Professionals. This topic is particularly close to my heart, because I also studied education at George Mason University and have a master's in education with a concentration in Assistive Technology.
I came to accessibility by way of disability services, libraries and higher ed. I also live with bipolar disorder, which informs my thinking and my work around accessibility and disability. Knowbility is a nonprofit organization based in Austin, Texas and an award-winning leader in accessible information technology.
Our mission is to create a more inclusive digital world for people with disabilities. We help make the internet and other technologies more accessible to people who are blind or low vision, deaf or hard of hearing, have mobility limitations, or have cognitive or learning disabilities. Our community programs and advocacy work are supported by generous donors just like you.
If you feel compelled to give, I hope you will, at knowbility.org/donate. Our community programs include the Accessibility Internet Rally, or AIR, which teaches web pros how to design accessible websites and apps in a practical, hands-on setting. Company or independent teams are trained and mentored while working with a nonprofit, artist, or community organization to build an accessible website.
It's a fun hackathon/competition, and registration for AIR 2023 is now open. You can find more info about that on our website. John Slatin AccessU is an annual conference where tech professionals, content creators, policy makers, and advocates come together for deep learning about accessible digital design. We just wrapped AccessU 2023 last week with a great hybrid event, in person in Austin and online.
Knowbility helps teachers, students, and parents navigate the accessibility challenges that can come with a digital learning environment. We have resources for parents, teachers, and administrators who want to learn how they can develop accessible digital programming for a K-12 audience. And finally, Be a Digital Ally, the program in which you're participating today, is our monthly webinar series that covers the basic skills and principles of accessible digital design. It's free, and it's meant for people who regularly interact with and create digital content but may be new to accessibility.
During today's session, we'll start our two-part exploration of how people with disabilities access digital content by watching and listening to a panel discussion that took place at Knowbility's AccessU conference last week. In this session, several Assistive Technology users from the Texas Workforce Commission answer questions from the audience about using Assistive Technologies such as screen readers, screen magnifiers, and other hardware, software, and strategies with websites, documents, and applications.
Before we hear from the Assistive Technology users, though, I want to make sure everyone has a shared understanding of some of the technologies and strategies the panelists will discuss. What is Assistive Technology, or AT? The Individuals with Disabilities Education Act, or IDEA, defines Assistive Technology as any item, piece of equipment, or product system, whether acquired commercially, modified, or customized, that is used to increase, maintain, or improve the functional capabilities of individuals with disabilities.
In other words, Assistive Technology is technology used to perform tasks that would otherwise be difficult or impossible. Along with the hardware and software that enable people with disabilities to access digital content, Assistive Technology also includes things like mobility devices, walkers, canes, communication systems, and devices that support activities of daily living, like grab bars or bath chairs, tools to help pick up objects, and so on. Some folks would consider eyeglasses an Assistive Technology, because they enable you to do something you would otherwise not be able to do, or have trouble doing, without them: seeing. Under that definition, the umbrella is pretty large. Today, people interact with technology using thousands of devices. These devices include Assistive Technologies, or ATs, that allow individuals with disabilities to engage with mainstream technology.
People who use AT include individuals who are blind and use audible or tactile output, like a screen reader that reads digital content using a synthesized voice, or refreshable braille devices; individuals with learning disabilities such as dyslexia, who use text-to-speech technologies that read text aloud while visually highlighting each word; and individuals with low vision, who enlarge default fonts or use screen magnification software that allows them to zoom into the screen.
They also include individuals with fine motor impairments, who use Assistive Technologies such as speech recognition, head pointers, mouth sticks, or eye gaze tracking systems; individuals who are deaf or hard of hearing, or who are in noisy or sound-free environments, and therefore depend on captions to access and understand content; and individuals who use mobile smartphones, tablets, or other devices, which have a variety of screen sizes as well as gestures or other user interfaces for interacting with the device and the content.
So there are many, many options available, and as we'll hear from the users today, not everyone prefers the same technology. Just like some folks prefer an iPhone and others prefer Android, some AT users might prefer to have text read aloud, while others might want to magnify it so that they are able to read it. We'll hear the variety in the technologies and strategies in our panel today. Now that we know what Assistive Technology is, let's hear from the panelists.
I'll be playing a portion of the panel discussion, sharing my screen and sound. If you have questions or comments, please share them in the chat as they come to mind. Let's make this an active discussion, as we watch together. After the video, there will also be an opportunity to ask and answer questions using the microphone. We're using Zoom's automatic captioning feature today. If automatic captions do not meet your access needs, please let us know when you register for future sessions and we'll make sure to provide what's needed, for you to fully participate. With that being said, I'm going to go ahead and get the video started. One moment.
Speaker 2: Yes, we can hear. Thank you.
Melissa Green: Perfect. Thank you very much.
Speaker 2: Take it away Joanna.
Joanna Blackwell: Good morning everybody, and welcome to the Ask an AT User panel. My name is Joanna Blackwell and I'm an Accessibility Specialist with the Texas Workforce Commission. This morning I'm joined by a talented group of Assistive Technology users, who also work at the Texas Workforce Commission and joined by my coworker Christina Miranda. Chris?
Chris Miranda: Hi, good morning everybody. My name is Chris Miranda. I'm also an Accessibility Specialist, and I am the Curriculum Developer and Training Specialist. We have a panel of five people here, and we're going to start with George. Introduce yourself, George; I'm going to pass the mic to you.
George: Good morning. My name is George [inaudible]. I'm a Vocational Rehab Teacher. Been with the agency for about 17 and a half years. I come up here from Houston, Texas, which is in region five of the Texas Workforce Commission.
Ernesto Sifuentes: Hello, my name's Ernesto Sifuentes. I'm a Web Accessibility Specialist for Texas Workforce Commission and I'll pass it on to...
Belinda Lane: All right. Hello, I'm Belinda Lane. I am in the Vocational Rehabilitation Division of the Texas Workforce Commission. I'm the Vocational Rehabilitation Teacher and Braille Program Specialist, and I work with our Teachers, in order to train our customers, to learn their skills and be able to get jobs of their choosing and live independent lives.
Andrew De Avila: Good morning. My name's Andrew De Avila. I also work with Texas Workforce Commission, but under that umbrella, I work at the Criss Cole Rehabilitation Center for the Blind, and I'm an Assistive Technology Instructor and just recently got my master's in Assistive Technology from Northern Arizona and we teach adults who are new to blindness, all the skills to become independent, and I teach the technology portion of that.
Kevin Ratliff: My name is Kevin Ratliff and I also work for TWC and for Criss Cole. I've been an Assistive Technology Instructor at Criss Cole, for 10 years.
Joanna Blackwell: And this is Joanna Blackwell again. Like I said, I work at Texas Workforce Commission as an Accessibility Specialist, and I actually do have one other question for everybody here on the panel. I guess I'll go ahead and get started, and that's about what Assistive Technologies you use in your personal and work life.
For those of you in the audience who can see, you may notice that I have Apple AirPods in my ears and I have an iPhone in front of me. That's because I'm using the VoiceOver screen reader to read some of the questions that we have for the panelists today. In addition to using VoiceOver on iPhone, I use it on macOS, and I also use the JAWS screen reader, the Narrator screen reader, and the NVDA screen reader. There are a couple of screen readers that have gone the way of the dodo that I don't use anymore, but I've been around in the AT world for quite a while. That's what I use when it comes to Assistive Technology, and I'm going to hand the microphone back over to Kevin.
Kevin Ratliff: Okay, this is Kevin Ratliff, and I have my Samsung computer here. It's a mainstream device, but I use JAWS and Narrator on it. I have various Android devices. I use a Pixel 6 Pro for my daily driver, and I also have a Samsung Galaxy S22 for work and personal use, and I have all of their accessories, watches, headphones, and let's see what else do I... And I use TalkBack on my Google and Samsung phones, and I use ChromeVox on my Chromebook, a Pixelbook Go that I use daily or almost daily, and I think that's just about all the Assistive Technology that I have.
Andrew De Avila: Good morning, this is Andrew De Avila, and I'm actually the opposite of Kevin. Kevin's our Android guy at the center, and I use everything Apple: macOS, iPadOS, and I also use JAWS, Job Access With Speech, NVDA, and Narrator as well. I'm currently going for my PhD, so I have to jump between different platforms to see which one works better with either macOS or with Windows. But Kevin and I being instructors, teaching people coming into the center, we have to keep up with the latest technology, and everybody wants to learn the latest trends and how to become YouTubers or podcasters. Kevin and I are part of a large group where we have to keep up with everything and test everything. That's pretty much it, I guess. Belinda.
Belinda Lane: Thanks. It's Belinda again, and I use a lot of the same things that Kevin and Joanna and Andrew do. On my computer, I use the JAWS screen reader and Narrator, in a pinch. I have Apple phone products. I use an Apple Watch, and I use VoiceOver on both products. Let's see, I use a Braille note taker and Braille displays. I have one to show and tell later. Y'all can come up and look after the talk. Let's see.
I use all of those at home and in the workplace. I also use JAWS on my computer. I have a Surface that I use, and anything that has a screen reader on it, I will use. I have a TiVo for my television; that way I can read the TV guide and access all of the recordings I have. Let's see, what else?
Anything that will talk, I use. I have a reader for medication. There are readers for money, but I use the phone, many, many phone apps, to access different things. AIRA I use for its live description, so to speak. If I need to set my washer and dryer, I can call them up real quick and say, "Hey, what is this set at?" Directions, too; I know George got directions using AIRA to get to this building. Many other things, and I'm sure George and Ernesto will touch on those too.
Ernesto Sifuentes: Hi, Ernesto again. Work-wise, I use Windows with JAWS and Windows Magnifier. I find that basically, I can do all my testing for my work. For play, I usually use the iPhone or Apple products with VoiceOver or the screen magnifier. George.
Melissa Green: Sorry, the picture has cut out for a moment. I apologize for that.
George: [inaudible], but anyway. I also use an embosser when I'm at the office, because I'm a Braille reader primarily; I tend to retain information better if I read it with my fingertips than if I listen to it with my ears. I use it as a comparison too, because I notice a lot of people with 20/20 vision, if they have to read a lot, they will print it, as I guess they read better when something is on paper than staring at a visual screen.
That's why, if I've got to read a lot and I want to memorize a lot of it, I will send it to the embosser and emboss it onto a hard copy. And just like my note taker, it has a Braille display, and I use that a lot. And lastly, I also use, and I'm sure everybody uses this, these smart speakers, when you're talking to your Alexa; somebody might have an Alexa at home.
Now I've just activated it. I use the Amazon Echo devices, as well as the Google Play thing on a daily basis and besides AIRA, like Belinda mentioned, I would also use Be My Eyes, I use everything. Whatever's available to me, I'm going to use it and since I am a Teacher, I also introduce and teach these things as well [inaudible] how they can use it, along with the various amounts of apps that are available. I'm done, I don't know who...
Chris Miranda: Okay, thank you. Thank you, George. This is Chris Miranda again. What we're going to do is, we've got some questions we pre-prepared for the panelists, but what we really want to do is have you guys ask the questions, anything... This is your chance to ask a user of Assistive Technology any questions that you have, that you might want some insight into, about how they use an app or a program or the assistive tech. I'm just going to start off with the first question, and the panelists will raise their hands if they want to answer it, and then I'll just call on them. Okay. The first question is: can you share an unexpected way that mobile or web technology has facilitated your independence or enhanced your daily experience? I see that hand going up, Kevin. It's all yours. Here you go.
Kevin Ratliff: Mine, with this question, is just seeing where Assistive Technology has come from. Imagining what I was going to do to live independently, how I was going to pay bills, and then to see those technologies come to fruition: to be able to pay bills online, to be able to write checks through online banking, to be able to do so many things, all the way up to AIRA and Be My Eyes and Lookout, which is computer-based Assistive Technology. Just to go from thinking, being totally blind as a kid, how am I going to do these things independently, and then the technology just came along, right at the right time.
Chris Miranda: All right. Joanna.
Joanna Blackwell: Thanks, Chris. I'm sure everyone here has heard of ChatGPT, and I'm calling this a web application because it's a website, and it's incredible in what it can do when it comes to describing images. I'm a reader; I love to read audiobooks. I was formerly sighted, so I used to read print books, but now that I don't do that, I can read audiobooks. But I remember loving just to look at the book covers, and I ask ChatGPT to describe a book cover to me, or I've asked it to describe actors.
I'm just curious about what they look like. This is something I could see helping in education. I'm not taking any university classes right now, but I could see it being useful in that. But just for me to be able to ask it to describe a book cover and get an answer is pretty incredible, and it has really enhanced my enjoyment of things that I like to do in my personal life. I have used ChatGPT in my work life too, not for it to do my work, but sometimes to give me a different angle, to think about how I might want to approach a certain topic or a subject. I found it really helpful in that regard.
Belinda Lane: And this is Belinda. To add to that, on the enjoyment of entertainment, I really, really love audio description. Audio description is for movies and TV shows, and it's somebody describing the visual aspects that we don't see, and this truly is an enhancement. You get to have those things, like say somebody receives a letter and they read it, but just silently, and all that you see is the letter on the screen.
The audio describer will read the letter, so you know what happened and just little subtle things. One of the movies I watched a long time ago, that really showed me the benefit and the impact it has, is Unbreakable, I think is the name of the movie with Bruce Willis. Yes. Excellent and what happened was, in the beginning, Bruce's character is getting on the train or the subway and he steps on there and he sees an empty seat next to an attractive woman.
He takes off his wedding ring and goes and sits down next to her, and do you know how devastating that was to me? Because I love Bruce Willis, and I thought, "What a cad." You get to know these things, and you're not just going around with your rose-colored glasses: oh, he's so wonderful. But anyway, I love that, and there's even live description; the coronation over the weekend was described, so you can know that they may describe, "Oh, the crown has these jewels, blah blah blah." It's really neat, and I almost... Well, I must confess, I do steer away from movies and things that aren't described, because it's like, "Oh, what happened?" I save those to watch when I hang out with friends, and otherwise I do that. It's really great.
Chris Miranda: Andrew.
Andrew De Avila: This is Andrew again. I would also agree with everyone here. It's definitely changed my life. I was diagnosed with Retinitis Pigmentosa my senior year of high school, and I did struggle. I couldn't see the screens, I couldn't see text, but when the Apple iPhone came out with VoiceOver, it definitely changed my life, because it allowed me to meet my wife. I spoke to her in person, but I was able to text her, email her, send her pictures. I jokingly tell her I lost my sight but not my game. I was able to do that with VoiceOver, and I didn't do very well my senior year in high school, but it's allowed me to go back to college. Like Joanna was saying with ChatGPT, I took a doctoral course, which I just finished yesterday, and it was an applied behavioral analysis, single subject design class.
And I used ChatGPT to describe these inaccessible charts and graphs, where there weren't any plot points, but it was a wavy or squiggly line, and I'd upload the picture, ask it questions, and it would describe it to me. I didn't have to rely on someone with sight, like my wife or just anyone, having to sit next to me and describe it, when I could just sit at my laptop and have this do it for me. Technology is coming a long way, and a lot of people are scared of it, but it's actually becoming more accessible for people with visual impairments.
There's a person who took a picture of the inside of the refrigerator and asked it, can you make me a dinner recipe with these ingredients? And sure enough, it gave this person a dinner recipe with the stuff that's in his fridge and that can definitely help someone like myself or people on the panel up here, because sometimes we're having to do the smell test or look at things or what is this? And technology's helping us out in education, employment. I'm actually excited to see where this goes.
Chris Miranda: What do you think George? You wanted to add? No, pass it down to George please, Andrew. [inaudible].
George: I basically just want to give a little bit more insight on what Belinda was talking about, with the descriptive video, because movies are my hobby. Every once in a while, I like to escape into someone else's imagination, with all of the stuff that we deal with on a regular basis. That's what I call it. I enjoy movies, and back, I would say maybe 30 years ago, AMC was one of the first chains of theaters that I knew of that offered this type of service.
But it was only available a week after the movie had come out, and only on one screen, and we used to have to call the manager and make reservations to ask him, can this movie be put in the descriptive video house? Because not every movie had that available. Well, as time has passed, almost all the national chains offer the service now, from AMC to Cinemark to Regal and some of those I can't think of right now, and because of technology, they have now incorporated it into the digital system, so that when a movie is first released, it is available with descriptive video. That means when a movie first comes out, like, I think, what just came out? Not Marvel, it's another one. Oh, I cannot think of it. I just saw the thing. That's terrible.
Oh. Well anyway, it'll come to me when I put the mic down. But anyway, it allowed me to take my 15-year-old to that movie. I don't have to wait to make a reservation, and the devices are available on multiple screens. At some theaters, they have to program it for that particular house. This one theater that I go to now, they just have a headset that is equipped for both hard of hearing as well as descriptive video. You can turn one number up to get the description and one number up in case you're hard of hearing, and I can go into any house; I don't have to go back and program this device to this movie. They have it available for whatever house you are in.
Whatever movie you want to see, it will be available with that, and it's also now available if anybody, and I'm sure a lot of us, have Netflix: if you go to the language selection, you can choose audio description right there, and you still have your visual, but now you can also get the description. That's for everybody. I actually want to challenge those of you in the audience and those who are watching online: if you have Netflix, just give it a try and see what it's like. Close your eyes and see what the acceptance of descriptive video has now allowed us to say: those of us who don't have vision, yes, we also watch movies.
Belinda Lane: I just wanted to make a comment on that too. It's that inclusivity. It really makes you feel great that you can just go like anyone else and see the movie, at the same time as anyone else, and have what you need to enjoy it to its fullest. I know, similar to George, our Regal Cinema here would show the descriptive video, but you either had to go to the matinee, the very first one, or the last one, and it's like, "I don't want to go there with all the kids, and that's beyond my bedtime." Now it's great. It's offered at every showing of the movie, all day.
Speaker 10: Guardians and Beyond. [inaudible].
Chris Miranda: Thank you, you guys. This is Chris again, and I just wanted to give the audience a chance to ask any questions of the panel. If you guys... Here we go.
Speaker 2: And I'll also note that for online users, if you go to the reactions button and raise your hand, we'll unmute you and allow you to ask your question, as well.
Speaker 11: Thank you. Hello. Is it on? The question is about the apps and whatnot that you rely on. You've mentioned many pieces of software. Some are very high profile, like screen readers, but some are more niche, like a custom app to read medication or money or whatever the case may be, and I'm wondering if you've ever relied on an app and then had it let you down, because sometimes apps don't get maintained or they get discontinued completely. Have you had any experiences like that?
Belinda Lane: Yes, it can only go so far. That's why we use so many different things. One of the other apps I use is Seeing AI, and it's similar to AIRA, which has the description, but it's more of an OCR tool. When I want to know, "Okay, is this a can of beans or is it my peaches," or whatever, you can scan it and it'll read you whatever the label is, and it also has the ability to read the QR code if necessary.
Sometimes it's terribly slow and you just want it now, and then sometimes it just won't do it at all, depending on the material; maybe it's shiny, or all kinds of things. Yes, we do encounter those drawbacks. And one of my main pet peeves, and I don't think this has anything to do with accessibility, or maybe it does, but gosh, when you make the hardware and you turn it on, make it do a beep or a sound. Otherwise it's, "Oh, is the light on? Well, I don't know," and especially cable companies, it's the worst. TVs are horrible. You have to lift up your Seeing AI and see whether the thing is on or off. But yeah, who wants to go next? George?
George: I actually am someone who doesn't have any vision at all, and I do my own grocery shopping; I don't have it delivered to me. I actually go into the store to get the stuff myself, because when I'm shopping, I like to put my hands on it. I can get a written description, but to me it's still not the same as when I put my hand on it. Now, two drawbacks can occur. I can go into a particular store, and if I'm going to use AIRA or Be My Eyes, I may get poor cell reception, and sometimes it breaks up and I lose that part of the service they're providing to me, because there may be a lag, meaning they probably don't get the image for maybe five or six seconds down the line, or the visual will be frozen.
That can happen. Or if I'm using the Seeing AI app and using the product portion of the app, which means it'll scan the sticker and tell me, if I'm getting a pack of ground meat, how many pounds it is and what the ingredients are, then if the sticker is old or has gotten wet or has been rubbed out, it'll tell me "not recognized." And now, if I'm at a particular facility that does not provide an assisted shopper, then I have to either interrupt another customer who's shopping or just wait till I get to the checkout and ask the cashier, does this say so-and-so? And depending on what store I'm going to and depending on how busy it is, they probably would go and get me exactly what I need or [inaudible] just to come back a second time. Those are some of the drawbacks of the apps that we use to enhance our independence, that could probably fail us at an inopportune time.
Chris Miranda: Joanna.
Joanna Blackwell: Thank you. Yes, there are apps that are very useful, or have been very useful, and then they do go away. Maybe the developer just can't sustain the app anymore because they're not getting paid, or there's just some other reason the app goes away. And as Assistive Technology users, especially as blind Assistive Technology users, we know we can't always count on our technology. We love our technology, but we do have to have other methods, and I know a lot of people here on the panel have other means of identifying information.
Even if you're not a very good Braille reader, you can have Braille labels to figure out, "What is this? What's in this can or what's in this box?" But when it comes to the apps, we may have our favorites, but for me, I'm always looking out for what's new, what's coming, and that way, I have more than one option. I can either use Seeing AI or I can use many other apps. I can use AIRA if I'm doing work. It's important for us, and we know, we wish it wasn't this way, but we always have to have more than one option, even when it comes to screen readers.
And I'd like to emphasize that I think the most important thing for people who are developers, people who are creating these apps, is just to make sure you are making your apps accessible, because if you don't, then it's just another thing we have to find a workaround for, and it seems like that's our whole life, finding workarounds for things that are inaccessible. If you can remove that barrier, maybe it's just a tiny barrier, like a button that's unlabeled or an image that doesn't have alt text. If you could just add those things and make our lives easier, we appreciate it so much, because it can be a struggle. We have alternative methods, but sometimes life is a headache when you always have to find a workaround.
Ernesto Sifuentes: This is Ernesto. Some of those workarounds sometimes, though, are interesting. I'll use Uber or something like that to grab a ride, and sometimes it's a mixture of technology, because it's not just Uber. It's also the cellular companies that provide the bandwidth or limit the bandwidth. In some instances, I can get stuck somewhere until... You can try to problem solve: "Well, maybe it's my phone, I need to reboot. No, that's not it. It says I've got two bars, why am I not getting through? Is it really them? But the other services are working?"
Sometimes you have to move over to another service, or it seems sometimes some specific things just don't work, because something is being blocked or something, and sometimes you just wait; there's no workaround. On another note, when we have applications, like Belinda was saying, we have Seeing AI, I believe you mentioned. Exactly what Joanna said, we're always looking out for other applications, such as Apple's. On the iPhone now, when you take a photo, if you have VoiceOver running and you keep your finger on it and let it keep reading, it will describe the image and describe the text that's on there.
Usually I take a lot of photos of documents, and I have a library of documents on my phone. It's a usual, I guess, photo album now, but if I'm looking at something, maybe for work, sometimes some things are a little weird. For a while, the color analyzer wasn't working with magnification. The US version now works with that. It's good. Actually, they released a second upgrade after the first one, the one that was working. You find that things get better or start working, which is great. And what I mean with the color analyzer, gosh, shaking my head, anyway, is that it'll work with the magnification, as it's got its own little magnification system, and it's two things moving. It's interesting and fun. It's a game, to always learn something. That's all I've got.
Chris Miranda: We have a question in the audience.
Speaker 2: We're going to take one online first, if that's okay. Will you unmute Carolyn?
Carolyn: Yeah, I'm not sure which question to prioritize, because I have lots, but thank you so much for being here and sharing your experiences. I guess I'll ask my questions on alt text and screen readers, since it seems like a number of you use that technology. I often write book titles, and I'm wondering, this is a multipart question. Part one: does capitalization in a book title make any difference to the screen reader, if each word is capitalized versus just the first one? Same with quotation marks around the book title: do those make a difference? My second alt text question is, if you're writing a time signature, for example, your alt text for a poster reads "this event is 6:15 to 8:15," what's the best way to write those numbers and the dash or the hyphens between them, so that they're read correctly? Yeah, those are two to start with. Thanks.
Chris Miranda: Joanna.
Joanna Blackwell: To answer your first question about capitalization and quotation marks, it usually doesn't make a difference. The quotation marks may or may not read, depending on the settings the user has set for their screen reader. Normally, on what we call the beginner level, it will read those, but a lot of people have changed that. If the quotation marks are important, you'll want to include them, because as screen reader users, we can read character by character, word by word.
Now, something that is very important to note is that with a hashtag, if all the words are not capitalized, most screen readers will just read it as gobbledygook, because they don't know where one word ends and the other begins. For hashtags, I think capitalization is important. That's one thing to keep in mind. But as for the titles and the other part of your question, it shouldn't matter if you capitalize or don't capitalize. Of course, if the book title is actually capitalized, the letters are capitalized, I'm not quite sure why you wouldn't capitalize it. You want it to represent whatever is actually shown in the printed text. That would be important. And the other question was about... I'm forgetting the other question, could you repeat your...
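The convention Joanna describes is often called CamelCase hashtags. A minimal sketch, in TypeScript, of what that looks like in practice; the helper function and example hashtag are invented for illustration, not something shown in the session:

```typescript
// Convert a list of words into a CamelCase hashtag so a screen reader
// can find the word boundaries and announce each word separately.
function toCamelCaseHashtag(words: string[]): string {
  const camel = words
    .map((w) => w.charAt(0).toUpperCase() + w.slice(1).toLowerCase())
    .join("");
  return `#${camel}`;
}

// "#globalaccessibilityawarenessday" tends to be read as one long jumble;
// the CamelCase form below is announced word by word.
console.log(toCamelCaseHashtag(["global", "accessibility", "awareness", "day"]));
// -> "#GlobalAccessibilityAwarenessDay"
```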
Ernesto Sifuentes: Numbers.
Carolyn: Specifically, timestamp for an event.
Joanna Blackwell: Yeah, you should usually be able to write the times. If it's, let's say, 6:00 PM to 8:00 PM, I usually just type out 6:00-8:00 PM, and then if there's central time or eastern time, whatever, just put that as well, and that should read just fine. Instead of a dash, you could write "to": 6:00 to 8:00 PM, for example, and that works for me. Anyone else want to add?
Ernesto Sifuentes: Yes, this is Ernesto. Periods and commas do make a difference. There is a slight pause; it does change the way it says things. As for dates, I'd rather have an actual "March 2nd" rather than a number, something that you can hear audibly well. That's my [inaudible].
Belinda Lane: Oh, George has one.
George: Just one. A lot of, not only us but even sighted people, [inaudible] are sending a text message on a phone. Sometimes they may type or they may dictate, and usually when I receive a text message, I can tell if someone has dictated, whether they're going to speak the punctuation or they're just talking, because when they're just talking, it makes one big long run-on sentence that the screen reader, the VoiceOver, is just reading.
But somebody like me, who's mindful of that, because I was a journalism major, even if I'm going to dictate, if I choose to dictate, I actually will speak the punctuation, so that it will look like real sentences. And if I happen to put timestamps in, like "meet me for lunch at 12:30 PM," I will have to go back and put a colon between the 2 and the 3, so it will say 12:30, not 1,230. That's where the colon comes into play: if you're writing something, and anybody's going to be using a screen reader to read it, those are important for it to read out loud, so it will enunciate it correctly.
Belinda Lane: Yes, I agree with that, and one rule of thumb, I think, for alt text, if you're describing something, is keep it simple. You don't need a whole lot, but it needs to capture what you're trying to describe. And as far as formatting of numbers and such, use the appropriate formats, for instance for phone numbers. It's irritating to hear area code 512 dot 231 dot 0867. The hyphens sound better when your screen reader reads it. It reads it as a phone number rather than an IP address.
Also, in the example of the book title, like Joanna said, just write whatever... Well, if it's already there, you're reading it. The capitals, unless it's just something... I don't really care what the font is. I know the title Gone With The Wind; I don't need to know that the word "wind" looks like a gust of wind. Well, maybe some people like that, I don't know. But keep it simple and, like George said, use the punctuation. If you are describing something, make sure you do put a period in there, so that there is that pause, things like that. [inaudible].
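Pulling the panelists' formatting advice together (write "to" instead of a dash, keep the colon in times, spell out dates, hyphenate phone numbers), here is a minimal sketch of how alt text for an event poster might be assembled. The helper names and the poster wording are illustrative assumptions; only the formatting conventions come from the panel. The phone number reuses Belinda's example digits:

```typescript
// Illustrative helpers for screen-reader-friendly formatting in alt text.
function formatTimeRange(start: string, end: string, zone: string): string {
  // "6:00 PM to 8:00 PM Central" reads more naturally than "6:00-8:00".
  return `${start} to ${end} ${zone}`;
}

function formatPhone(area: string, prefix: string, line: string): string {
  // Hyphens are announced like a phone number; dots can sound like an IP address.
  return `${area}-${prefix}-${line}`;
}

// Assembled alt text for a hypothetical poster image.
const posterAlt =
  `Accessibility meetup on Thursday, June 15th, ` +
  `${formatTimeRange("6:00 PM", "8:00 PM", "Central")}. ` +
  `RSVP at ${formatPhone("512", "231", "0867")}.`;

console.log(posterAlt);
// -> "Accessibility meetup on Thursday, June 15th, 6:00 PM to 8:00 PM Central. RSVP at 512-231-0867."
```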
Andrew De Avila: Yeah, and to touch on what everyone has said: using the tools in your ribbon is actually very helpful. Inserting those date and time stamps, the numbering. Me going back to school and reading long text, I know labeling your heading level ones, twos, and threes is actually very helpful for someone using screen reading software. We actually have control over what we want to read, using, say, Job Access With Speech, JAWS.
We can have it read some, most, or none of the punctuation, so we have control over how we want to customize that reading, but making sure that you do it on the front end is going to be very helpful, those periods, those commas. If you're wondering, well, is this going to be accessible, a lot of times just using the tools in your ribbon, inserting those numbers, the dates, the times, the majority of the time the screen reading software will read that in Microsoft Word. And Kevin, I don't know if you wanted to touch on anything.
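Andrew's point about heading levels applies to web pages as well as documents: as Belinda notes later about apps, screen reader users jump from heading to heading, which only works predictably when levels nest without skipping. A minimal sketch of that idea, with an invented checker function rather than anything demonstrated in the session:

```typescript
// A tiny check that a document's heading levels never skip (e.g. h1 -> h3),
// which is what keeps heading-to-heading navigation predictable for screen reader users.
function headingLevelsAreSound(levels: number[]): boolean {
  let previous = 0;
  for (const level of levels) {
    if (level > previous + 1) return false; // skipped a level
    previous = level;
  }
  return true;
}

console.log(headingLevelsAreSound([1, 2, 3, 3, 2, 3])); // true: well nested outline
console.log(headingLevelsAreSound([1, 3]));             // false: h1 jumps straight to h3
```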
Chris Miranda: Thank you, guys. Belinda, did you have something else? Okay, I just wanted to add something on the alt text, and we'll get to you next. Okay. From my experience, I've seen people actually put instructions in the alt text, and, if this is still true, you guys, the screen reader reads the alt text all at once. There's no reading word by word, so you can't stop it, pause, and go, "Okay," and then on to the next. You don't want to do that.
You want to put a succinct description. You can put a lot, you can make the description really describe the photo, and you can make it shorter by not adding "this is an image of"; just write what it is. But just remember that the screen reader will read it all at once. Can we... I don't know if we... Yeah.
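A minimal sketch of the "keep it succinct, skip the 'image of' prefix" advice; both alt strings and the file name are invented examples, not wording from the panel:

```typescript
// Two alt attributes for the same hypothetical photo. Screen readers already
// announce that the element is an image, so the prefix just adds noise,
// and the whole string is read in one go.
const verboseAlt =
  "This is an image of a photo showing two students who are using the " +
  "exercise equipment inside the newly renovated recreation center building.";

const succinctAlt =
  "Two students using exercise equipment in the newly renovated recreation center.";

const imgTag = `<img src="rec-center.jpg" alt="${succinctAlt}">`;
console.log(imgTag);
```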
Maddie: Thank you so much for all your great answers today. My name is Maddie, I'm a cisgendered white woman with brown hair and glasses. I am asking this question from the position of hoping to get into auditing websites and apps. I'm very curious to know which industries are creating really accessible and usable websites and apps and which aren't. Yeah, I'd love to hear which industry to go into, to start auditing websites for usability. Thank you.
Chris Miranda: Joanna.
Joanna Blackwell: That's a hard question to answer, which industry is doing a good job of making accessible websites? It runs the gamut across all industries. Some companies are good and some companies are not good and it really doesn't matter what industry it's in. I've seen good banking websites and terrible banking websites, good shopping sites and terrible shopping sites and everything along that line from good to very bad. I don't know that I can answer that question. Do we have anyone else?
Chris Miranda: Oh, Kevin.
Ernesto Sifuentes: This is Ernesto. Maybe a couple that I've seen do a really good job: actually HEB, which is local to Austin and most of Texas, I guess lower Texas and stuff. Their app is amazing. They actually do a pretty good job. I don't like shopping, I don't like going to the grocery store, because it's really difficult to find a can or something. I've got to go by color, I don't go by words; I've got to look for the general symbol or the pattern it builds over and over. That's what I'm looking for when I'm shopping, and having an app like that is just great. They do a really simple, straightforward job of ordering food. Banking, that runs the gamut. Even Amazon, Amazon's done some terrible things here and there, but then they fix it. They're always messing around with the website. Audible, I always like what Audible's doing, or has done in the past. Of course, Amazon now owns that. Yeah, does anybody have some [inaudible]. Here you go.
Chris Miranda: Oh, Kevin.
Kevin Ratliff: [inaudible]. Like Joanna said, it spans the gamut. I do a lot of usability testing, and the industries that I've noticed tend to pay a little more attention are banking and insurance. Most of what we would consider, I guess, major banks tend to at least give a nod toward accessibility. They probably have some user testers. I've used Chase and Bank of America, I've used Wells Fargo, with no problem at all.
Each might have a different approach, but they are trying to make the app user-friendly and accessible. Sometimes the website might be accessible but not so user-friendly. Like Amazon, it's just too huge, but then other websites or other companies will have an alternative site. Sometimes you go to the mobile version of the site, or Amazon has an accessible version of their site, and then they might also have something like disability support, where you can just call in if you're not able to manage it with your screen reader.
But I've done a lot of testing for insurance sites, large and small. Shopping, HEB, Kroger, different things like that and they have lots of other things that are helpful to the mainstream as well, like preparation instructions and things like that. It does span the gamut, but the insurance, the banking, medical, I've seen more of those websites, those companies, try to make sure that their site is both accessible and user-friendly. They care about the usability as well.
Belinda Lane: This is Belinda, and I agree. The banking industry has some very good ones, some better than others. All the credit card companies have an app, and their websites are really good. Discover, American Express, I have them all on my phone; you can do everything with the phone now, and it's very accessible. The medical industry, which Kevin just touched on, a lot of those are really good too.
Nowadays, "Here, go to our patient portal," everything you have to do online now and luckily, they are making them pretty accessible and that's really neat, to be able to go into one of your patient portal sites and independently and confidentially, look at your own information, test results, if you want to look at the test results or your Doctor may comment on things. That is a really neat advancement in that industry, I think. Now, there are shortcomings.
Sometimes the whole thing will be perfectly accessible until you get to one spot, and I have to find one of my sons or somebody who can see: "Hey, can you make this move on," because now I'm stuck and it's a visual thing. Especially those, what are they called, the CAPTCHAs? Those are a nightmare, and they say, "Oh here, use an audible one instead," and then, well, what part of the audio do I use? Do I use the garbly stuff?
Where does it end? And I'm thinking maybe if you just put anything in there, it takes it. I don't know, maybe I've cracked the code. Yeah, because if it says... The number ones are hard, because I have to write them down 7, 6, 3 and 4, 5 and sometimes I might leave off one and it still works. I don't know, but just sometimes the whole thing isn't quite there.
The entertainment industry has come a long way. The apps: Disney Plus is fabulous, Netflix too. Peacock is a nightmare. HBO Plus, another nightmare, but what was the other one? And the reason they're nightmares is because you have to swipe a hundred billion times to get anywhere, and like Andrew or Kevin said, the headings really help, even in the apps. If you can jump from heading to heading, that really, really helps the experience.
Chris Miranda: Do we have any questions from the...
Speaker 2: We do. Let's ask Jane to unmute.
Jane: Thank you very much. First of all, I'd like to point out that there are a ton of wonderful questions in the meeting chat, and I hope we'll get a chance to get to some of those. The question I want to throw out: we've been talking a lot about ChatGPT, and I was doing a presentation last week where I was trying a number of different alt text generation strategies using variations of ChatGPT, and I found that for abstract things like pictures, either it was so vague as to be unusable or it was just plain wrong. With a lot of promotion of ChatGPT as an alt text solution, I was wondering what your experiences and thoughts were on that.
Chris Miranda: Joanna.
Joanna Blackwell: Yes, we had experience with image description before ChatGPT, like Ernesto was saying with Apple and how it will describe photos to you, but it makes mistakes. After President Trump was in office and Biden became president, it kept identifying Biden as Trump. It said President Trump standing at a podium, and of course, that's not right. We know there are problems, there are issues.
As with anything, for people who are blind, we know sometimes we have to double-check. If something seems a little off, we know we may not be able to trust it. But that being said, it does often do a good job. Just imagine somebody put a blindfold on you and handed you a picture and said, "Hey, here's this picture and it's got a tree in it," but then imagine you could put it in your iPhone and VoiceOver would say, "This is a pine tree with a mountain in the background."
That's amazing. Even that incremental increase in the amount of detail we get is really incredible. As blind people, I think we've all been taught, and we know, that we can't trust technology all the way, but the way it's going in the future, the smarter the AI gets, the more we will be able to trust it. But there's probably always going to be a need for human intervention. Personally, I think it would be fine to use something like ChatGPT to make a description, but then you definitely want to check it with your human knowledge and see, "Is this correct or not?" Now, I would like to say we do like alt text to be simple, but this is my personal preference: I love as much detail as I can get about an image, because it's really a personal thing.
But for me, even though I'm blind, I'm a visual learner. The more detail you can give me about how something looks visually, the happier I am. That's just my personal preference, and obviously not all AT users are the same. It would really be nice if there were options like, "Here's the simple description, here's more detail." And this brings to mind, and it's related to ChatGPT, there's an app out there called Be My Eyes, and they're actually working on something called a virtual volunteer, and they're going to be using ChatGPT, where you can take a picture of, let's say, a boxed dinner or something like that, and it will tell you what that boxed dinner is.
You can ask it questions like, "What's the nutritional info, how do I cook this?" And it will answer from the pictures that you took of that box, which is super incredible. Personally, I'm very excited about these options that are going to be available to us soon, but as always, we have to be careful not to put all of our trust in technology. I know that was a very long answer. Anyone else want to take this one?
Ernesto Sifuentes: This is Ernesto, just real quick. As these newer models get better at describing, you're going to find some that work a lot better. I think some of the things that are coming up are really interesting, and I know this is more about alt text, but say you have a picture and you want to investigate it a little bit more. It'll give you a general idea of what the picture is about, and then you can actually interact with the picture: "Oh, what's in the background? Oh, so there's trees. What kind of trees are those?"
And it starts identifying a little bit more information, and that's what's really interesting about some of this technology: you can get a general idea of what's going on and then focus in on, maybe, "Oh, there's a table off to the right and people are eating," or something like that, and then it starts describing some of the people. Now that's really interesting, I think, for a blind user, because you realize there's another world going on that you've captured somehow in that photo.
Kevin Ratliff: This is Kevin. I have had a lot of luck with the image descriptions built into Chrome and Edge, and being able to check them against a couple of other sources or maybe someone sighted. Obviously, I can't upload every picture that I have to that service, but using that has been... I haven't had a lot of experience with ChatGPT. I've used Google Bard for some things, but not so much for the image descriptions.
But another thing that has been impressive is, if I double tap and hold and go to details on an image on Facebook, I can go to more details and it will tell me where things are laid out, if there's a sun in the right-hand corner or if there are balloons ahead, so you can get more than the description that it generates automatically. Those are some of the other ones, and then if you pull a picture into Word or PowerPoint or whatever, it can try to do an automatic image description as well. We had some of these services before the ChatGPT options came along.
Chris Miranda: I understand we have a lot of questions from... You have a question? But we also... Okay.
Paige: Hi, my name is Paige. I was curious in your educational experiences, either as a student yourself or if you have students, are there any Assistive Technology advancements that exist now, that have been especially impactful in a classroom or school setting or would've been especially impactful to your personal experience?
Chris Miranda: Anybody want to take that one? Can you repeat the question?
Paige: Sorry. In your experience, either as a student or maybe a teacher of students, are there any Assistive Technology advancements that exist now, that have been especially impactful in a classroom or school or learning setting, either currently or that would've benefited you in your earlier experience as a student?
Ernesto Sifuentes: It's interesting. At least in my college years, this was when technology was still ramping up, but let's say I was attending school still. Going to school is still challenging, and a lot of what I found, when I used to teach, is that some of these kids would come back with little knowledge of how to really use their screen reader. That was one thing. Another thing was just inaccessible content.
When you mix both, sometimes there are two things going on, and unfortunately, that's our world sometimes. But yeah, screen readers have been just incredibly helpful, as we can all say. But as for new technology right now, I would say ChatGPT is really interesting to play with, and now if they could just fix their application a little, that'd be great, but sorry about that. Yeah, the new technologies coming up, like ChatGPT, are just really amazing. Does anybody else?
Joanna Blackwell: I do.
Ernesto Sifuentes: Go ahead.
Chris Miranda: Joe. I was just going to say too, maybe we'll reduce it down to two answers, so that we can get to all the other questions.
Joanna Blackwell: I'm not in college right now, it's been a long time, but I would like to add that honestly, I'm not sure that it's about making Assistive Technology better, although there is that.
What is most important is making sure your content is accessible, rather than putting the onus on the Assistive Technology to do the work that should be done by professors, by the universities, by the colleges, by the K-12 schools. That's what's really important, because we can have the most amazing Assistive Technology, but if your content's not accessible, it doesn't do us any good.
I think this is one thing that's really important, and I appreciate the question about the Assistive Technology, but sometimes, and this may just be from lack of experience on the part of developers, content creators, teachers, professors, they think that just because you have this amazing Assistive Technology, obviously you can just do the assignment, even if it's an inaccessible assignment. That's just something to keep in mind.
Chris Miranda: Okay, and you said we had...
Speaker 2: Yeah, Amber asked, what are the most frustrating experiences when navigating a website or app, especially with something like insurance, healthcare, or other information heavy products?
Ernesto Sifuentes: I've got a story about that. Yeah, CAPTCHA. Five months to fix a CAPTCHA, and this is house payments, just trying to make your payment. You've got to go through this sideways portal or whatever, instead of creating an account, which you cannot create, because you can't just call in; you have to go through their portal thing that has a CAPTCHA that you can't access. Five months, finally fixed. Maybe I'll just leave it at that.
Chris Miranda: Okay and we're going to go ahead and go through the questions.
Speaker 2: Okay. I'm going to unmute Carolyn again.
Carolyn: I just have another question. Sorry, I have non-alt-text questions too, but especially with a whole panel here, as screen reader users, what's your preference for descriptions of race, and specifically skin tone, when race is unknown, in alt text? Because I've seen that Cooper Hewitt says you should say light, medium, or dark skinned, if race is unknown, to de-center whiteness. But then I've also heard from folks who don't like hearing that. Just as users, what is your preference, if race is unknown? Obviously if it's known, then you state it.
Chris Miranda: Oh, Joe.
Joanna Blackwell: I personally like light, medium, and dark skin tone. I think that's actually good, even if you know the "race" of a person, because people may identify as a certain race or ethnicity, but that doesn't mean that their skin tone matches whatever people think. It's all very stereotypical, is what I'm trying to say. I like to know about people, I'm curious about visual details. I think it's fine to describe, but I don't think that we need to go on and on about it. Light, medium, dark is good for me. Anyone else?
Ernesto Sifuentes: It's interesting when people describe themselves, too. I personally don't like doing that. Actually, I think it ends up being more about the person than about what is actually going on. It's not information I really need. Yeah. That's my take.
Belinda Lane: Yeah, I think it's all individual, but I'm with Joanna. Just a little description helps. If it's in alt text for pictures, I think the light, medium, dark is good enough. On the iPhone, in your little icons, it does that. I wish it would just let you do the silly line drawing or whatever. I don't want to have to waste my time picking in there, "Which one is it," but yeah, for your alt text, I think that's good, and you're going to get everybody across the gamut, those who appreciate a description and those who don't really need a description.
Chris Miranda: Any question from the live audience or...
Speaker 16: Oh, I have a related comment and it's about gender expression. For me, what I've noticed lately is the audio description, oftentimes it stops saying a woman, a man. It would just say a person, which is fine, because we don't know how people identify, but I would like to know what they're expressing with their appearance, because is this person expressing it as a woman or a man, or maybe non-binary? That would be nice. I honestly would find that helpful, but this is just my opinion.
Chris Miranda: George. George has a...
George: It does depend on the setting. Related to movies, if I'm listening to the audio describer, and I will admit it depends on the particular actress, if her voice sounds appealing, yeah, I want to know everything. I've found myself sometimes Googling, if I know the actress's name, "What's the height, weight of so-and-so?" Because I want the real picture. I want to be on that playing field like you guys. Y'all have the ability to look at the screen.
You can tell if this person is very curvaceous or this person is very slender. A good example: I watched a Netflix series called The Night Agent, and throughout all of these episodes, one of the [inaudible] people kept describing the lady as "the hazel-eyed female," and that's all they said. I don't know if the hazel-eyed female was somebody who was Mexican or somebody who was Caucasian.
I don't know. I don't have the ability to know if hazel eyes are limited to a particular race or not, and since they never spoke the name in all 10 episodes, we don't know what the person's name is, so I can't say who played so-and-so in that role, because all the audio description says is "the hazel-eyed female." I said a whole lot of stuff just to say it depends on the setting.
Chris Miranda: Well said.
Melissa Green: All right. I hate to cut things off there with our panelists. I don't know about you, but this is the second time I've been through this panel, and I've learned more things each time. I think it's just been a really rich discussion, with our AT users sharing their needs and preferences around AT. Let me switch us back over to our slide deck. We've left a little time at the end of our time together for questions. We've been doing some questions and comments through the chat. I messaged Julie Anne a few minutes ago and told her I felt like I was hosting Pop-Up Video, if any of you recall that show, sharing content and information throughout the session, but what questions do you have? What comments or thoughts would you like to share at this point? You're welcome to do that in the chat, or raise your hand and I will call on you. Well, as questions and thoughts come to mind, here's Jock. Jock, would you like to unmute your mic and ask?
Jock: Yes, I sat in on a Knowbility webinar this morning and Nick Steinhouse was there with two other people and I've got great respect for him. I'll just read what I wrote to him. Again, it has to do with alt text. Nick writes... Let me see. John Peterson said, "Should alt text be short, so as not to go past a character limit? Or for instance, if an image has a lot of text on it, should alt text include the text, regardless of the character count?"
And then Nick wrote back, "You can put more text in alt text than is useful." In other words, you can overdo it, he says, "in part because long chunks of content are difficult to navigate with a screen reader," which is also true. It's often recommended to put longer text descriptions outside of the alt text, and my question was, "Nick, please explain: where does this longer description go?" And then he answers, "There are many ways to handle this and it depends a lot on your context." If you guys could just help me: what does that mean? There are many places it can go, it all depends on your context. Where do I put it, if it's long, long?
Melissa Green: Yeah. I shared this earlier in the chat; it's buried now, so I'm going to share it again. A lot of this work that we do around digital accessibility is informed by the Web Content Accessibility Guidelines, WCAG, and one of the resources that the W3C WAI provides is a collection of techniques for achieving conformance with a guideline. All this to say, I've provided a link to a page that has a bunch of different ways you can provide a longer text description outside of the alt attribute of the image element.
It can be as simple as providing the description in the context of the page, or creating a standalone page that has the description on it and linking to it, or it could become much more complicated. Again, like Nick said, it depends on the technologies you're using and what you're trying to accomplish, but I think that list is a good start for getting some ideas and inspiration for different approaches to providing image descriptions.
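Two of the simpler patterns Melissa mentions can be sketched in markup. The chart, the ids, and the wording below are invented examples, and the techniques page she links to covers more approaches than these two:

```typescript
// Pattern 1: short alt text, with the full description in nearby page content
// referenced via aria-describedby.
const describedInPage: string = `
  <img src="q2-sales-chart.png"
       alt="Bar chart of Q2 sales by region"
       aria-describedby="q2-sales-details">
  <p id="q2-sales-details">
    Sales rose in all four regions; the West grew fastest, from 120 to 180 units.
  </p>
`;

// Pattern 2: short alt text, plus an ordinary link to a standalone description page.
const linkedDescription: string = `
  <img src="q2-sales-chart.png" alt="Bar chart of Q2 sales by region">
  <a href="q2-sales-description.html">Full text description of the Q2 sales chart</a>
`;

console.log(describedInPage, linkedDescription);
```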
Jock: Okay, I'll copy that. That's good. Thank you.
Melissa Green: Yeah, sure. I see another comment in the chat: "I would like to expand on the gender issue. It makes me uncomfortable to write young man or young woman when it's necessary to include in alt text. Are there any rules I'm not aware of?" I sometimes feel uncomfortable with this as well, just because I have to figure out what's relevant. If I am the content creator, if I am designing a presentation and I choose to put an image in it, I know why I selected an image of, let's say, a group of students.
Let's say I put the image there, to show the student recreation center, that was recently renovated. I would probably just describe it as students using recreation equipment in the newly renovated center, as opposed to a young man or a young woman, because in that case, I think what's most relevant, is that they're students, not necessarily what their gender is. I might describe someone's age or gender, if I felt like that was relevant to... If that was the function of the image, if that was the purpose of including it.
Oh, these are women who were working in factories during World War II, for example. That would probably be relevant in a historical context, that there was a cultural change during that time and more women moved into the workforce. It's something that makes me uncomfortable as well, but good guidance for me when describing images is just: what's the function of this? What's the purpose of it? What do I want people to know or do as a result of seeing this image? That usually informs my description. Sometimes that might include personal characteristics like age, gender, race, if I know those things, and other times it might not. I know that's one of those unsatisfying "it depends" answers, but it really does depend.
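As a small illustration of the "describe by purpose" guidance, the same kinds of photos Melissa mentions might get different alt text depending on why they appear on the page. These strings and file names are invented examples, not rules from the session:

```typescript
// Alt text written for the purpose the image serves, rather than by listing
// every visible personal characteristic.

// Purpose: showing off the renovated recreation center; gender isn't the point.
const altForFacilitiesPage =
  "Students using exercise equipment in the newly renovated recreation center.";

// Purpose: a history feature where who is pictured is the point.
const altForHistoryFeature =
  "Women assembling aircraft parts in a factory during World War II.";

console.log(`<img src="rec-center.jpg" alt="${altForFacilitiesPage}">`);
console.log(`<img src="wartime-factory.jpg" alt="${altForHistoryFeature}">`);
```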
We are just about at the end of our time. We were together for a long time today. I'd like to thank you so much for joining us and choosing to celebrate Global Accessibility Awareness Day with Knowbility. Coming up next month, we will continue this journey into how people with disabilities access digital content. Part two will happen on Thursday, June 15th at 5:00 Central. You can find more information about that on the Knowbility website. Thanks again, and we look forward to seeing you next month. Have a great night.