
Talking Drupal #371 - WebOps Enabled Accessibility

Source: https://www.talkingdrupal.com/371

Transcript: 

Nic  0:06  
This is Talking Drupal, a weekly chat about web design and development from a group of people with one thing in common: we love Drupal. This is episode 371, WebOps Enabled Accessibility. Welcome to Talking Drupal. Today we're talking about WebOps enabled accessibility with Tearyne Almendariz and Blake Bertuccelli. Tearyne has been the guest host for the past four weeks, and today she's with us as a guest as well. She's the developer advocate at Pantheon and lead of the Drupal Diversity and Inclusion Initiative. She's also a Certified Professional in Accessibility Core Competencies through the International Association of Accessibility Professionals. Thank you for joining us for the last four weeks.

Tearyne  0:48  
Thanks for having me.

Nic  0:49  
It's been fun. Blake is the founder of Equalify, a new open source accessibility platform that's working to provide more affordable accessibility testing. And I hear that congratulations are in order, as you just got married a couple of days ago. Congratulations.

Blake  1:04  
Yeah, I went from BADCamp to my marriage, so it was a fun-filled week.

Nic  1:10  
Sounds exciting. Welcome to the show, and thank you for joining us. I'm Nic Laflin, founder of Enlightened Development. And today my co-hosts are, well, usually here's where I say "as usual," but it's not as usual. It's a nice blast from the past. Stephen Cross is guest hosting with me and filling in for John today. If you listened to the first 300 episodes of Talking Drupal, you definitely know him. And if you're new to the Talking Drupal world, go back and listen to all the old episodes. He was a former co-host and now works behind the scenes. Welcome back to the show.

Stephen  1:45  
Hey, super happy to be here.

Nic  1:48  
We do have a quick update this week. New England DrupalCamp is November 18th through 19th in Providence, Rhode Island. The 18th is training, I believe, and the 19th is the session day. So if you're in the New England area or able to travel, definitely check us out. There are a bunch of really good sessions coming up, and I'm definitely looking forward to seeing everybody again.

Stephen  2:14  
So this is the Module of the Week. We have something new here at Talking Drupal. We have, I think we're calling it a correspondent, a Module of the Week correspondent. A frequent guest and contributor to the show, Martin Anderson-Clutz, is here to join us this week, and every week for the near future, and he's going to talk about the module of the week. Martin is an Acquia triple-certified Drupal expert and the maintainer of a number of Drupal modules, in addition to being a Senior Solutions Engineer on Acquia's pre-sales team. So, Martin, welcome, and what is our module of the week?

Martin  2:54  
Thanks, Steve. And actually, before I jump in there, I want to mention that if anyone has a module that they want to nominate for Module of the Week, please tag me, mandclu, in Drupal Slack and put that into the Talking Drupal channel. So the module that we're going to talk about this week is called Entity Comparison. Essentially, what it does is generate a configurable comparison table for two or more Drupal entities. Anybody who's been to a number of e-commerce websites, or even just product catalog sites, will have seen this kind of functionality, where on individual products you can click to add them to a comparison, and then click to actually view that comparison. I've noticed it's actually not dependent on Commerce, but it's something you should be able to use with Commerce if you need to. In terms of its history, the module was originally released in 2017, but the current 4.0 release was created in August of 2022, and it's already ready for Drupal 10, in addition to having security coverage. It's not super widely used currently; it's used by 292 websites. But the issue queue for the module is pretty clean: there are three open issues, and two of them have patches that need review. Talking a little more generally about how the module works, it's actually pretty simple to set up. I tried it out for the first time three or four weeks ago, and it took me probably less than half a day to really get it set up and working, including customizing which attributes would be used for the comparison. It basically generates comparison configuration entities, and within those you specify which entity type and which bundle will be used. That will automatically create a view mode for the selected bundle, which allows you, through the Drupal UI, to specify which attributes are going to be displayed, in which order, and using which formatters. It also provides a couple of other nice features. There are custom blocks that you can use: one to place on your product page to add to that comparison, as well as another block to provide the link out to the actual comparison itself. The other thing it does is provide a field of its own, so if you have your standard product page and you don't want to use that block, you can display that as a standard field through the Field UI as well. So maybe for the rest of the hosts here: has anybody used either Entity Comparison or a different solution for generating this kind of product comparison functionality?

Nic  5:40  
What it sounds like is the key feature of this is that it allows the end user to compare two entities, right? It's not that you're just building a table to compare a couple of things. It's so that the end user can say, I'm interested in this one and this other one, what's the difference? That's the key feature, right?

Martin  5:55  
Yeah, very good point. As you say, it's not like on some Amazon product pages, where you'll see a pre-generated comparison of all the products in their line. It's more for the visitor to be able to pick which products they want to compare.

Nic  6:09  
I have not used this. I've built that feature a few times manually, so I'm sure I have a couple of clients that I'm definitely going to take a look at this for, and see if it's worth moving all the custom code into the system, because it's fairly hefty to manage that yourself.

Martin  6:30  
I can definitely think of a couple of products or projects that I worked on where we talked a customer out of this kind of functionality, because we thought it would be too expensive for them to implement.

Stephen  6:42  
Yeah, I was going to ask you, Martin, what was the use case that made you look at this module? You had something in mind?

Martin  6:48  
So I was talking to a customer who is thinking about going to Drupal for their CMS, and one of the capabilities that they really wanted to see was this ability to do a product comparison. So I did a little searching around and found this, and I thought it was great that, obviously, you could use it on a commerce site, but the fact that you can use it in a variety of different use cases I thought was pretty interesting. So as an example, you could do something like, let's say, a job search site, where the person who posted a listing could actually do a quick comparison of different people who had applied to that job, or conversely, a job seeker could do a comparison of different jobs that they're interested in.

Nic  7:29  
Interesting.

Tearyne  7:30  
Yeah. This makes me think of the healthcare.gov page, right? For context, if you haven't used it: when you sign up on healthcare.gov, you can go in and there are tons and tons of plans, depending on context, that you can view. But sometimes you want to see, side by side, the deductible for this one versus the deductible for that one. And having to build that kind of thing out for so many different plans can be very cumbersome, but allowing the end user to control that makes sense; only you as an individual know what your healthcare needs are. So I can see a lot of utility there.

Nic  8:07  
Can you expose some of the options to the end user too? So rather than just saying, I want to compare these two things, can you give them the ability to say, I want to compare the price or the description? Or is it just, as you say, if you're comparing these two entities, these are the fields you get, and that's that?

Martin  8:27  
So the comparisons are sort of predefined, but theoretically you could have more than one comparison for the same bundle. So you could say, compare based on price, or compare based on features, as two different options, and expose those.

Stephen  8:43  
Martin, have you determined how flexible the user interface is to make modifications to that kind of UI for the end user?

Martin  8:53  
So, full confession, I haven't used this a ton myself. But based on that initial setup and trying it out, because it leverages the Field UI, which is very flexible within Drupal, and you can even create completely custom formatters if you need to, it really gives you a lot of flexibility as a site builder.

Tearyne  9:16  
Now I want to build something just to experiment.

Nic  9:21  
Well, thank you for joining us, Martin, for this new segment. It definitely is an interesting module. Like Tearyne said, I have a couple of use cases in mind for it. So thank you for joining us, and we appreciate your input.

Martin  9:34  
Thanks for having me.

Stephen  9:36  
We'll see you next week.

Nic  9:38  
Definitely. Okay, moving on to our primary topic of the week. The title of the show this week is WebOps Enabled Accessibility. So let's start with: what is WebOps? I mean, I've heard of DevOps. What is WebOps?

Tearyne  9:53  
WebOps is a set of practices and workflows that facilitate collaboration, encourage automation of operations, and improve the productivity of web teams. DevOps we're all familiar with, where it's bringing developers and the system operators together. With WebOps, you're focusing on how you're bringing together your developers, systems engineers, marketing operations, designers, content creators, and other folks in marketing into a cross-functional team. I think one thing that's really important is that a lot of the folks listed as the actors on WebOps teams are some of the key owners of, and have direct impact on, your accessibility items, because they're the ones forming the shape of your content: your designers, the people sending out your marketing ops emails that people get, and especially your content creators, where you have to say, no, you can't use size 11 font for this email, please put alt text on things, it's required, we might get sued.

Blake  10:53  
One of the things that we found is that a lot of the people who are managing accessibility are not developers; they are end users. And, you know, they're not even familiar with anything beyond a WYSIWYG editor. So when they're asked to manage big accessibility issues that are being reported to them, they don't have an easy user interface to both manage the tests that are going into these accessibility requirements and also the remediation of actually fixing the errors. So that's one of the reasons that we're using the word WebOps a lot, because we want to work with people who aren't just developers. There are a lot of great development tools for accessibility testing and remediation, but there's not a lot of easy-to-use and, most importantly, affordable user interface components that allow the end user, the editors, the marketing managers, the accessibility managers to really get in and manage their own accessibility.

Nic  12:03  
So now we know what WebOps is: it's pulling in the full team, from content to developers, rather than focusing just on the developer side, right? But what is WebOps enabled accessibility? It sounds like it's a little bit about allowing the non-technical folks insight into it. Is it beyond that? Or is it really just what you were saying?

Blake  12:25  
I mean, what we look at a lot is also the tools that people are using. So when people are using web services not as a CI component or a CD component, but as a literal interface that they log into, use, and manipulate, that's when it starts to go into the WebOps world a little bit more. So it's not just that we're talking about the stakeholders, it's also the tools the stakeholders are using, and great tools like Pantheon's system. We're answering the question of how web services like that integrate with other web services to provide better accessibility. And we think it's not just about doing one thing; it's not just about updating one piece of the code base. You really have to think about the entire network of infrastructure that you're using to run a website when you're talking about accessibility. We see accessibility errors ranging from user-created errors to server-created errors. And so in order to maintain and manage those errors in a way that is holistic, and kind of speaks to the reality of where errors are coming from, you have to think about all the different users that are using and maintaining the website, and all the different types of technology that goes into creating these websites.

Stephen  14:02  
So you've said that you should be thinking about all of these different users. Why should you be thinking about them?

Blake  14:13  
The big reason is because the people with the budgets are giving money to people who are non-technical to manage these accessibility problems. And just a quick history on why accessibility is important: it starts with giving people with disabilities access to information that we all know and love. That is the real reason why we're doing this work and why it's so important. The second tier is the government-mandated piece. So the US, for instance, and governments all over the world are doing the same thing: if you receive money from the US through, say, a grant, so if you're an educational institution in Michigan and you receive money through the US Department of Education, you have to maintain website accessibility standards, Section 508 standards, and you are legally mandated to maintain those standards. It's just like having a wheelchair ramp into your place of business: you have to have these basic accessibility requirements met, or you're going to get sued. So what we're seeing right now are a lot of higher ed institutions specifically, and that's who we're working with mostly with Equalify, but a lot of higher ed institutions are getting these scary letters from the US Department of Justice. And they're freaking out and hiring law firms, or hiring big, really expensive companies who say that they're going to do a bunch of remediation. And then they also start departments in their own university. And the people starting those departments, we see again and again, are folks that are non-technical. They might have subject expertise, but they're non-technical. And we think that these non-technical users should be enabled to remediate accessibility, just like the great developers who know how to run the CI and CD tools that are out there today.

Nic  16:11  
And so traditionally, what kind of tools are used for that? You're automating that on the DevOps side?

Blake  16:18  
On the DevOps side, there are some great automated accessibility scanning tools. So axe by Deque is a really cool one. There's Google Lighthouse, which a lot of people use in the browser. There's WAVE, which we personally have integrated with and love to use; they have an API that you can basically run any kind of URL against. And then there are other tools; Little Forest is another good one. So those are the automated accessibility scanners. These are going to find about 13 to 40% of your accessibility issues, so they're not going to find all of them. If you use Google Lighthouse and it scans through your website, it'll probably find around 30% of the compliance issues you have. The second tier is when you start to need more specialized experience, and that's literally manual testing. The WCAG 2.1 guidelines have manual tests and different guidelines to maintain accessibility. Now, that type of manual testing, which is required, is going to need some level of subject expertise. So you're going to have to be aware of and know about these manual tests, and, just like with any kind of HTML guidelines, you read through the guidelines, try to memorize them, and do the best you can. I remember I was working with somebody from the Department of Justice on how they test, and literally it was one person who did not use any of the automated tests. She just really knew all the guidelines. So she went through and pressed the Tab key and saw where the focus went; she had a screen reader that read out to her. And she went through the guidelines and tests to decide, very manually. And I do hope that with some of this WebOps conversation and some deeper integrations with Drupal, we can automate, or at least make that experience of manual testing and remediation a little bit easier.
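
As a concrete illustration of the kind of automated scan being described here, below is a minimal sketch (not Equalify's code) that runs an axe-core scan against a single URL. It assumes the npm packages playwright and @axe-core/playwright are installed; the URL is a placeholder.

```typescript
// A minimal sketch: run an automated axe-core scan against one URL with
// Playwright. Assumes "playwright" and "@axe-core/playwright" are installed.
import { chromium } from 'playwright';
import { AxeBuilder } from '@axe-core/playwright';

async function scan(url: string) {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  // Wait for the network to settle so late-rendering content is included.
  await page.goto(url, { waitUntil: 'networkidle' });

  // Restrict the scan to rules tagged as WCAG 2.0/2.1 level A and AA.
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
    .analyze();

  for (const violation of results.violations) {
    console.log(`${violation.id}: ${violation.help} (${violation.nodes.length} nodes)`);
  }

  await browser.close();
  return results.violations;
}

scan('https://example.com').catch(console.error);
```

As the conversation notes, a scan like this only surfaces a fraction of real-world issues; it complements, rather than replaces, manual review.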

Stephen  18:34  
To be honest, that manual person is better than what we see in a lot of places, right? At least it's someone who's dedicated to figuring that out.

Tearyne  18:47  
Yeah, one thing I'd like to add in there is that there are certification tests for it; like I mentioned in my intro, I'm a CPACC certified practitioner. But the test is very intensive; it covers a lot. I had to take the CPACC certification twice, just because there's so much content covered in it, both for the US and for all the other laws that you have to comply with overseas as well, because our sites are not just seen in the United States. Blake mentioned earlier the manual code review, with us learning all of these criteria that exist under different principles. Even when you're doing your manual code review for your WCAG 2.0 standards, 80% of those standards have to be tested manually, and for the 2.1 success criteria, 100% of those have to be tested manually. So considering how few people are actually certified to do this manual testing, and how much manual testing there is even once you're certified and have so much of this memorized, the more of this you can automate, the more money you're going to save for your company and the entities that are your clients, and the more time you're going to save for your teams. And that means they can focus on the bigger, stickier, hairier issues.

Stephen  20:11  
Are you suggesting that the automated process that we're talking about today replaces a certified person?

Tearyne  20:19  
No, no, no, it augments that experience. That way, if I come into a project and I'm having to do all these manual reports, going through and testing everything one by one, that's going to take time away from me being able to communicate why this change is important to a client, right? I am both a front-end developer and somebody that's a certified accessibility practitioner; I can be spending time remediating things if I'm not spending all my time doing these things that could be automated. We'll probably talk about it more in a bit, but there are aspects like user experience review and user testing, and those aspects will also get more time the more of this rote, testable stuff we can automate.

Stephen  21:08  
Well, let's talk about that now. So what kinds of tests are we talking about?

Blake  21:13  
The big stuff. So Equalify has been scanning, using multiple different scanners, thousands and thousands of pages, and there are about 111 checkpoints that we have found are really successful in the automated testing. A good chunk of those, I would say around 15 of the 111, are around ARIA and around alt text. So that's a big one: just labeling things for screen readers and providing alt text for different media items. That's a huge thing that people just forget about; it's not there. Other issues that we're seeing a lot of are around color contrast. That's a really big piece: just not having the right amount of contrast for people with low vision. And then we see the more complicated stuff around the way we're titling things, the way we're using headings, and the kind of descriptive text that is being used. A lot of that more nuanced stuff isn't really well tested by these automated tests. But at least you can find the big pieces around alt text, ARIA text, and the semantics of HTML: making sure you have your h1 followed by your h2, followed by your h3. Because if you look at people who are using screen readers, they're jumping around a page a lot, and they use these tags to navigate. And it's amazing, if anybody ever has a chance to watch somebody who really knows how to use a screen reader, it's one of the most amazing experiences, because they're being read the information like a book, but at super fast speeds, sometimes two or three times speed. And they're navigating through a web page like we do visually, where they're jumping around and saying, what do I find interesting? And if they don't have those proper headings and proper links, if there's a lot of heavy JavaScript and modals, they're not able to find the information. And just one quick story that somebody from the Department of Justice told me: they were testing a lot of the higher ed websites with people with disabilities and screen readers, but the websites were so bad, had so many modals and so many weird pieces, that the people using screen readers didn't even know they were missing out on content. And it was a huge issue, because they couldn't report on errors, because the websites were so bad that the screen readers wouldn't even read them. There was a lot of JavaScript put in, in an unacceptable way, and adding modals was the biggest issue. So again, the semantics around HTML are really key, along with describing the images in a very descriptive way and adding color contrast.
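
The heading-order point is straightforward to check mechanically. Below is a toy sketch, not a full WCAG check, that flags skipped heading levels given the list of heading levels on a page; the sample input and the browser snippet in the comment are illustrative only.

```typescript
// A toy illustration: flag skipped heading levels, e.g. an h4 directly
// following an h2, which makes navigating by headings with a screen
// reader confusing.
function findSkippedHeadings(levels: number[]): string[] {
  const problems: string[] = [];
  let previous = 0;
  levels.forEach((level, index) => {
    if (previous > 0 && level > previous + 1) {
      problems.push(
        `Heading ${index + 1} jumps from h${previous} to h${level}, skipping h${previous + 1}`
      );
    }
    previous = level;
  });
  return problems;
}

// In a browser you could collect the levels like this (hypothetical usage):
//   const levels = [...document.querySelectorAll('h1,h2,h3,h4,h5,h6')]
//     .map(h => Number(h.tagName[1]));
console.log(findSkippedHeadings([1, 2, 4, 3, 2])); // reports the h2 -> h4 jump
```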

Tearyne  24:32  
Yeah, a lot of that is stuff that you can catch with the manual tests, the semantics, right? But once you get into the user experience review part of testing, as mentioned, it's things like your visual structure, your logical page layout, menu functionality, button size. Those are things that need a human being to go in and test, whether that's somebody with different vision issues or mobility issues. I have a kind of fun anecdote on that one. When I was working for local government, we built out a retirement portal, and we were having folks fill out the form. Now, one thing I did not think about: I don't keep my nails done, but I have kind of smallish, medium-sized hands filling out this form. We went to a job site with some gentlemen that had much bigger hands than I had, and some of them even had the clubs where you could do things tactilely, and when they were trying to press the buttons, they couldn't, because the buttons were too small for them. And I never would have known that had we not done a few extra rounds of review-type stuff with them. So yeah, that's why we get to automate the other stuff.

Nic  25:35  
One of the things I'm curious about too, when you're talking about this, is the reporting. It's clear we're a long way away from accessibility testing being fully automated, right? But we want to automate as much as we can, because, like you said, it frees you up to do other stuff. But one of the things I've run into fairly often, especially around color contrast, is false positives. The automated system basically says, hey, we can tell you the color here, but there's a background issue, so we can't tell what kind of contrast it is; it needs to be manually reviewed. I find, whether you're technical or not, but particularly for non-technical folks, that very quickly just becomes white noise if on every single page they see there are 30 issues for color contrast. Now, in the example I'm thinking of, there is no color contrast issue; everything has been fully vetted. But how do we make sure that it doesn't become white noise? How are we informing these users: when you see this issue, this is specifically what you should be looking at to confirm? Or is there a way to say, ignore this unless it changes in the future, because we've vetted this, we don't want to be flagged on this issue again unless something changes, because then something might actually be broken, right?

Stephen  26:52  
That's the key right there. That's the key: you say, I've addressed it, don't tell me unless something else has changed, right? Yeah.

Blake  27:00  
Yeah. So at Equalify, that use case, that issue, is exactly something that we've built our platform for. We do have a big Ignore button that keeps the issue from coming up again. We have about five different criteria that we look for, and if an issue is reported within those same criteria, it will be ignored; you can always ignore. And furthermore, if you take care of it, it will be marked in Equalify for all history, so you will never have to re-mark it. That said, we do see the false positives a lot. And another thing that we're trying to solve with our platform is to create an environment where you can easily integrate many different types of scans, or many different scanning solutions. So what we've done is create our own kind of version of modules, where you can pretty easily integrate any scan, whether it's axe, WAVE, any of these, into the Equalify system. And we've been running a few different scans, and what we're doing with these scans is running them against each other. So for instance, we can run Little Forest against WAVE. And when we did that, we saw that Little Forest was producing a lot of false positives that WAVE just wasn't picking up on, so we were able to flag them as probable false positives. So that's just an extra level of automation where we can reduce the noise, and it's something that we saw as a unique piece and why we wanted to develop it. Because when a company is focused so much on their own scan, and they're not constantly comparing the different scans, they can get into a real big false positive loop where they just don't recognize it. So we're able to not only ignore the false positives, but to flag them, and we're working closely with some of the scanning developers to say, hey, this is coming up again and again and again, you probably want to fix this. So hopefully we can limit it. Eventually, in our next version, we're going to be putting out better statistics on the false positives we find so that anybody can access them, so we can publicly shame the automated scanning services to make some corrections and give them that information.
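
A simplified sketch of the cross-checking idea described here: findings reported by one scanner but not reproduced by another get flagged as probable false positives for human review. The Finding shape, its key fields, and the sample data are assumptions for illustration, not Equalify's actual schema.

```typescript
// Cross-check two scanners: keep candidate findings that the reference
// scanner did not also report, as probable false positives to review.
interface Finding {
  url: string;
  rule: string;      // e.g. "color-contrast"
  selector: string;  // CSS selector of the flagged node
  source: string;    // which scanner reported it
}

function probableFalsePositives(candidates: Finding[], reference: Finding[]): Finding[] {
  const key = (f: Finding) => `${f.url}|${f.rule}|${f.selector}`;
  const confirmed = new Set(reference.map(key));
  return candidates.filter(f => !confirmed.has(key(f)));
}

// Hypothetical usage: compare one scanner's results against another's.
const scannerA: Finding[] = [
  { url: '/about', rule: 'color-contrast', selector: '.hero h2', source: 'scanner-a' },
  { url: '/about', rule: 'color-contrast', selector: 'footer a', source: 'scanner-a' },
];
const scannerB: Finding[] = [
  { url: '/about', rule: 'color-contrast', selector: '.hero h2', source: 'scanner-b' },
];
console.log(probableFalsePositives(scannerA, scannerB)); // only the footer finding
```

Real scanners have different coverage, so an unconfirmed finding is only a hint for manual review, not proof of a false positive.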

Nic  29:36  
Yeah, I wouldn't say that's publicly shaming them, because they're doing God's work. But I will say one thing that I noticed heavily, in particular on this project I've talked about on the show before: we use native web components extensively, and these automated scanners do not handle color contrast in native web components. Now, one of the great things about native web components is it's semantic HTML, it's in the spec. A lot of the issues that you would get from JavaScript-oriented stuff don't exist with native web components, because the scanner just dives into the Shadow DOM and it just kind of works. But color contrast is not one of those: almost every native web component that has a background color anywhere just automatically gets flagged.

Stephen  30:23  
Why would a web component be different than any other HTML in terms of color contrast? I don't follow that, in terms of the testing side.

Nic  30:33  
My understanding is it just can't determine the way it's layered. It just doesn't know if the color is in the background or on top; it just can't figure out the layering, for some reason.

Tearyne  30:48  
It probably doesn't understand the relation between it and the other elements around it.

Blake  30:53  
Yeah, a lot of these scanners, and not all of them, are just doing a simple curl of the HTML on the page, not really anything beyond that.

Stephen  31:04  
That's why I was wondering how a web component would be different. At the end of the day, the HTML that gets rendered on the page is what's tested.

Blake  31:11  
It could depend on when they did the curl. We tested a few scanners that scan the page before the page is fully rendered, and all the render-blocking scripts they just totally ignored. So it's really about what the scan can do, and when it's scanning the page.

Nic  31:42  
So one of the things that Drupal really pushed for, especially I think in the Drupal 8 release and moving forward, is accessibility. It's not 100% accessible out of the box, but it does a lot better than a lot of other CMSes out there. But I guess the next question is: if you're using a tool like Drupal, and it has accessibility out of the box, why do we need accessibility testing?

Tearyne  32:13  
Because of all the things we developers do in the aftermath. A lot of these services come with accessibility as the base, right? I mean, the web itself comes with a base of accessibility. But there are all the neat tricks and tips and cool functionality that we want to build in, that'll create a dynamic experience for our users. And each thing that we do to modify that changes the experience for the user; it changes how that page is built out. That's why, to me, it's important for us to use Drupal or WordPress and have that accessibility as our base, so that we have a good place to build off from, but then also to be cognizant of the fact that when we create a change, it has an impact. Just like any other kind of code or JavaScript thing that we do, it will have an impact on your accessibility as well.

Blake  33:07  
And so many of the accessibility errors are around user-added content, I would say. I was talking to someone from the Colorado University system, and they said, everything that we commit is accessible, it's all good. But users, at the end of the day... I mean, I've seen some pretty locked-down environments where there's no WYSIWYG, there's nothing, and it's still amazing what users can do to make things inaccessible. A lot of times it's around images, that's what I see a lot of, but then PDFs are terrible. That's one of the biggest headaches that comes up, as soon as a PDF is even linked to.

Nic  33:53  
Or YouTube.

Blake  33:54  
Or YouTube?

Nic  33:56  
My website is a static website that doesn't have much on it. One of the only pieces of functionality on the site, really, is just a link to YouTube for the show, right? The only page that throws any accessibility errors is the YouTube one. There's a lot of iframe stuff in there, and I just haven't quite resolved it all yet. But yeah, when you're adding content, it's going to be a problem. And one of the things I think people forget is that you can make things like alt tags required, right? But you can't force the user to not just put in a period, or put in the name of the file, which isn't helpful. And it also depends on what you're targeting. The lowest level is just a brief description that says the image is a picture of a person, right? But if you're going for full AAA, you're going to say, this is an image of a person looking at object X and wearing a yellow jacket. It's more of an actual description of the full scene. And that can't be automated, on the testing side or on the entry side, whatsoever.
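
A rough heuristic along those lines, purely illustrative: required alt text can still be useless, such as a period, the word "image", or the raw file name, and an automated check can at least flag those obvious cases for a human to rewrite. The patterns below are assumptions, not any standard.

```typescript
// Flag obviously unhelpful alt text for manual review.
function isSuspiciousAltText(alt: string, src: string): boolean {
  const trimmed = alt.trim().toLowerCase();
  if (trimmed.length < 3) return true;                                  // "", ".", "x"
  if (/^(image|photo|picture|graphic|img)$/.test(trimmed)) return true; // generic words
  const fileName = src.split('/').pop()?.replace(/\.\w+$/, '').toLowerCase() ?? '';
  if (fileName && trimmed === fileName) return true;                    // alt equals file name
  return /\.(jpe?g|png|gif|webp|svg)$/.test(trimmed);                   // alt is a file name
}

console.log(isSuspiciousAltText('.', 'team.jpg'));            // true
console.log(isSuspiciousAltText('dsc_0417', 'DSC_0417.jpg')); // true
console.log(isSuspiciousAltText(
  'A person in a yellow raincoat looking at a sailboat', 'team.jpg'
)); // false
```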

Stephen  35:04  
That's where that manual person looking at the content helps out.

Blake  35:09  
Yeah, we've seen that a lot. One of the most successful use cases I've seen around training users was at Harvard, where before any user could put in any content whatsoever on a site, they had to go through a pretty extensive training. And it was a training built on empathy. So they worked with and observed people with disabilities: why you would need to do this, and why you should describe a raincoat as yellow and then go into the bigger descriptions. And I'm a big believer that if you can build those kinds of empathetic tools and show why you need to talk about the content on a page in a more descriptive way, you will have better accessibility standards, and your users will follow through on it. And most of the sites that don't have great compliance are not training their users and do not provide this type of very important training.

Nic  36:13  
It also takes a lot of maintenance sometimes. So for example, many times when you build a front-end component, you're saying this is an h3, right? And if that gets put after an h2, great, but if it gets put after an h4, well, that's going to be a problem. But giving end users the ability to choose the heading level mid-stream in content is a lot of mental load for them. They have to think: where in the content is this? What's the level before it? It needs to be this. Or if somebody has the right order and then they pull a piece of content out of the middle, well, now all of a sudden you're jumping a level, so then you have to go and edit everything below it to get it to the correct level. So it takes a lot of maintenance and a lot of thought to make sure you're making it easy for content entry people to make that change. And I'm not sure that there's a perfect solution around that, because it's a lot of effort for a developer to give complete control there, and a lot of effort for the content editor too. Or in Drupal, and this is a Drupal-specific example, pagers by default are h4s, right? You don't always want an h4 at the bottom of the page; sometimes it should be an h5, sometimes an h6. But that's something you're not going to give the end user the ability to change, right? So how do you find the right thing? I mean, the right thing is to not have it be a heading, period. But it is definitely something that takes a lot of maintenance on both sides, both the developer and the content side.

Stephen  37:56  
Blake, you mentioned the training that they did at Harvard. Do you have any recommendations for content editor or admin trainings that we could share with our listeners and put in the show notes?

Blake  38:09  
There are a lot of good trainings. I just saw one come out from the Department of Justice; they have a pretty good video series that just came out. And the reason why I think it's a good training is that they're good videos, but it's also what they're looking for. So if people are worried about getting sued, they can say, I've done these trainings and I've seen this training. They're a much more intense kind of training. Harvard's is actually just available for anybody to use, which I'll give a link to as well. But the best thing that I've seen, and this is for me personally, is talking with and working with somebody with a disability, and just asking, how do you browse the internet, and watching them browse the internet. There are a great number of videos of just people using the internet, and that's super inspiring. Because again and again, you could have X amount of standards, but until you know how it's going to impact somebody, you can't really think about that.

Nic  39:19  
I mean, one of the most impactful keynotes that we've had, I think, at NEDCamp was when we had a gentleman from, I think, an institute for the blind in Massachusetts come; I think his name was Brian, right? He was a keynote speaker, and his talk was basically: here's an accessible website that I can use, and here's a local website. He chose some local restaurant that's not accessible, and just showed us the pain that's involved in browsing a site where nobody's paid attention to accessibility. I was thinking of that earlier too, and we'll put a link to that, because I think we have it in our feed; someone will put a link there. Just being able to see how people experience the web gives you that empathy; it makes you realize we really need to pay attention to this. And I was thinking earlier, when you mentioned two-times speed, I felt like he was listening to it at five-times speed. You couldn't hear what the screen reader was saying, and he was already moving on to the next thing. He knew where he was, where he was going, and how to get there on the sites that he had experience with, which is the same as us. If you're on a brand new site, you're going to spend some time looking down the page, finding what you want, but if it's a site you use every day, you just go on and hit the scroll wheel just right, and if it's a long page, you end up right where you want to be. It was the same experience for him, but auditory, and at a speed where I couldn't even understand what the text-to-speech was saying.

Tearyne  40:56  
I really like this conversation thread around empathy being part of the training, because I feel like this relates to a lot of different facets of inclusion, right? We've all felt, at some point, because we're human beings, that there's something that's, quote unquote, extra that we're being asked to do. And it's like, well, why do I need to do X? Bringing in the empathy of understanding what the other person's experience is when we don't do X, I feel like that makes a lot of us more inclined to understand the importance of it. So it's great.

Blake  41:28  
And also, so much of the bad accessibility remediation that I see, the kind that maybe meets compliance, dumbs down web content rather than making it usable for people with disabilities. Because, again, if you look at someone with a screen reader, their ability to understand information is amazing, just straight up amazing. Maybe they're understanding it in a different way, or using different senses than we are. But I've seen websites and website managers say, oh, well, let's cut a lot of the content off, because we don't want to bother with dealing with the headings of this content, and things like that. And that might solve the compliance issue, but it's not doing a service to the end users. So things don't need to be dumbed down to be accessible; they should just be able to be used by different senses. That's a key component, I'd say.

Nic  42:38  
I do want to say, though, that sometimes cleaning up the content is part of the solution too. Sometimes people put in a lot of extraneous information, and so sometimes cleaning up the content and trimming it down is part of the solution. It shouldn't be the only thing you do, but it definitely can help, and it can help the people who aren't using screen readers as well. I mean, if you have 300 words on the page when you only need 50, that's an accessibility issue for everybody, right? Who has time to read 300 words when 50 will do? But you shouldn't do it in a way that's patronizing, right?

Blake  43:24  
Yeah, the exact example that I'm thinking of was literally a website manager not wanting to put in really vital images, graphs and things, because graphs are really hard to articulate with screen readers, and it's still an issue that I've been working with for a long time. They were literally cutting out these graphs, these data visualizations, because they don't have the time to figure out better ways to articulate the information, and then literally dumbing down the content, because then it doesn't have the data that should be on the page. I see it a lot in higher ed institutions, and it just drives me nuts, because that's the meat and potatoes of these websites.

Nic  44:10  
It's definitely not the solution. Okay, so when you're talking about the WebOps workflow, where does this get put in? Because normally, automated testing for DevOps happens during the CI pipeline, before deployment, or before deployment to an environment. Where in the WebOps workflow does accessibility testing get put into the process?

Blake  44:36  
So the way I see it, there is the ideal way, which is every step along the deployment process, whenever you consider deploying, even when committing, I would say, when you develop a new feature and you want to make sure it works. There are a lot of good CI/CD tools that can work when you're merging your code and getting it ready. But where it goes into WebOps is when you publish content, and before that content is promoted or marketed is the best time that I see folks utilizing it. At Tulane University, where I am right this second, a lot of the checks happen literally before a user can push the publish button, so it'll go through the automated checks before they can publish. I've also seen marketing departments deploying it very successfully: before they market any piece of content, that's when they go through the checks. But that's not the ideal, because you're only checking a small percentage of the content, because it's only the content being marketed. It really should be every step along the way. There isn't a single "now it's QA, accessibility time," because it changes so often. Ideally, you're also employing regular audits across the entire web estate. Some of the more successful universities I see using it have an annual or quarterly audit of their content. That's really successful, but it's a constantly moving target, so it should be constantly tested.

Nic  46:26  
And there's really no, like, "this is the time, this is the QA time." Yeah, it's kind of like technical debt in some respects: it accumulates over time. Sometimes, as vigilant as you are, some things will slip through the cracks, or standards will change, or people's understandings will change, and there's a maintenance aspect to it. But I think your comment that it needs to be done regularly is a good one. I mean, I have a client, and it's a discussion we've had on the show over the past couple of years, where accessibility a few years ago was a line item, right? And over the last couple of years, it's just become: you make accessible sites if you're a modern web developer. It's not a line item anymore. Now your prices reflect the additional work that goes into it, but it's not a line item; it's not something that a client can say, well, I don't have the budget for that, so I'm not going to do that. No, accessibility is part of the budget. But I'm seeing another shift now with some clients, where, even as it shifted to not being a line item, it still was: you build the site, and then you do the accessibility pass before you go live, like there's a QA phase and there's an accessibility phase, right? I'm seeing clients just moving that up and saying, we do the accessibility review in the design phase, we do it in the integration phase, we do it in the development phase, because if you catch those issues earlier, it makes each step easier to maintain that accessibility level. If you wait to the end and you go, you know what, these buttons are all too small, well, now just making them bigger might make the designs look weird. If you had addressed that during the design phase, then everything would have been fine, and that never would have gotten flagged; it would have been fixed early in the process. So I'm seeing a much earlier integration of the accessibility side with a lot of clients, which is encouraging.

Tearyne  48:18  
I'm definitely seeing it in a more holistic way as well. My background was in local government, and then eventually I switched over to finance. I got called in to do a remediation because they had seen the talk that I did about accessibility on the local government side. And being a financial institution, especially, they need to make sure that everybody has equal access to lending, right? And they had a visually impaired user who could not navigate, could not update their forms. And not only was my job to remediate the site, but also to teach the QA team how to test for accessibility. So I had on a lot of hats. But it was really great to see the QA team catching things I had not even taught them about yet, because they went through that process. And so not only on the project we were remediating, but on all other projects that came through, they started applying those standards, so anything new that came in was going through those tests.

Stephen  49:18  
Yeah. So, Blake, you mentioned that you would love to see this integrated into all aspects of the process; I think you mentioned commits. So I'm trying to imagine in my head what accessibility test would run when I did a commit.

Blake  49:39  
Well, with a commit, I was actually more thinking about what your commit message is. Because, you know, Equalify is a totally open source, AGPL, friendly GitHub repo, and we have users that like to read all the commits, and they do want to know what you're doing, how you're doing it, and how you're testing it. And, for instance, here's a direct example about emojis in commits: emojis are not accessible. So it's little things like that. What I'm talking about is more like thinking about accessibility every step of the way. It should be part of every developer's, designer's, and user's thought process throughout everything, including in the commits. So when I'm committing pieces, if it's an accessibility-related commit, I'm thinking about how I'm branching it, how I'm relating it to different issues, and things like that, so that users with disabilities who want to test these commits can think about that. That's an extra level, but I would say, just on a basic level, not including emojis and things like that to describe information is starting to think about accessibility in commits.

Stephen  51:12  
So in some of the projects I work on, we have Git hooks that trigger processes to run when I do a commit, and I was trying to imagine what an accessibility one would look like. Because I think it's very content-heavy to do an accessibility test. So that's where my question was: how could I do that at the commit level, to do accessibility testing? I think that's difficult.

Blake  51:44  
Yeah, I mean, I've seen the automated accessibility testing usually happen before the deployment point. So I don't think it's usually a CI/CD kind of test for the automated accessibility. But again, what we have been pushing hard on is that accessibility shouldn't just be about the automated tests; it should literally be about every step along the way, in the way that you think about your end users every step along the way, and who is actually using the code you're writing. We should think about how those end users, with whatever disabilities they might have, or even if they don't have disabilities, how your end users can understand what you're putting out there.

Tearyne  52:35  
There is an axe linter tool that you can run with GitHub Actions, and it'll run on pull requests and look for accessibility vulnerabilities. So there's that, which you can use in the process, and I'll link it.
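
For readers who want to wire something similar into their own pipeline, here is a hedged sketch of a CI gate, not the axe linter Tearyne mentions: scan a preview URL with axe-core and exit non-zero when serious or critical violations are found, so a pull-request job (for example a GitHub Actions step that runs this script) fails. The PREVIEW_URL environment variable, the severity threshold, and the use of the playwright and @axe-core/playwright packages are assumptions.

```typescript
// Fail a CI job when a preview environment has serious/critical violations.
import { chromium } from 'playwright';
import { AxeBuilder } from '@axe-core/playwright';

async function gate(url: string): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle' });
  const results = await new AxeBuilder({ page }).analyze();
  await browser.close();

  const blocking = results.violations.filter(
    v => v.impact === 'serious' || v.impact === 'critical'
  );
  if (blocking.length > 0) {
    console.error(`${blocking.length} serious/critical accessibility violations:`);
    blocking.forEach(v => console.error(`- ${v.id}: ${v.help}`));
    process.exit(1); // non-zero exit fails the pipeline step
  }
  console.log('No serious or critical accessibility violations found.');
}

gate(process.env.PREVIEW_URL ?? 'http://localhost:8080').catch(err => {
  console.error(err);
  process.exit(1);
});
```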

Nic  52:49  
That's an interesting framing, calling it a vulnerability. You're starting to treat it like security. It's a usability vulnerability, but it's also a legal vulnerability. I'm going to have to think about framing accessibility to some clients like that.

Stephen  53:12  
And for our listeners, when you come back to the show notes, I think we probably have the most resources, or close to it, that we've ever added for any show.

Tearyne  53:27  
So much to learn.

Nic  53:28  
Yeah, absolutely. So if we're talking about these tests again, whether they're happening in CI or happening before, what kind of results are you getting from them? Are they pass/fail? Are they percentage-based? Are they just output to read? What can you expect as a response from these types of tests?

Blake  53:52  
So you're testing largely against WCAG 2.1 compliance standards, and you're seeing a lot of pass/fails related to the tests that you can automate; you can automatically test against compliance standards, and then there's additional testing around contrast issues. Depending on what scanner you're using, it might just say, you failed, figure out the solution, or, you failed, here are the tests that you failed, and here are some possible solutions. So for example, WAVE does a really good job of listing out the guidelines that are related to the test, and then hoping that you will piece together the solution by following those guidelines. Yep, go ahead.

Nic  54:55  
I was just saying, I think that's my biggest complaint about accessibility testing. Whether you use axe or WAVE, I feel like it's always: here's the issue, and axe will link to the Deque article and the specification. Something like color contrast is fairly straightforward: you take the two colors, you put them into some contrast checker, and it's, okay, what am I targeting? Does it have to be 4.5 to 1? 3 to 1? 7 to 1? What does it have to be? Easy, right? The automated test couldn't figure out what it was, fine. But then there are other ones, other false positives, that aren't as easy. Like, well, I don't think it's an issue, but it's very difficult sometimes to map the specification to the actual implementation that you have and know for sure whether it's a false positive or something that you actually have to fix and address. Or even if you know it's a false positive, sometimes it's, okay, how do I address it? If it's, oh, there's an h1, and then there's an h2, and then an h1 again, the way to address that is: I need to make that second h1 not an h1, or I need to wrap it in a section or an article or something like that, right? But sometimes, with the spec, it's just, okay, where do I even begin to address this? I think that's one of the biggest complaints that I get from other developers, and it's a complaint that I have myself. How do you go about making that determination? Is the answer just, you need to find an expert and ask them? Or is there a resource for bridging that gap?
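
For reference on the contrast question specifically, this is the WCAG contrast-ratio math that the checkers apply: level AA requires at least 4.5:1 for normal text and 3:1 for large text, and AAA requires 7:1. A minimal sketch taking hex colors:

```typescript
// WCAG relative luminance and contrast ratio for hex colors like "#336699".
function relativeLuminance(hex: string): number {
  const channels = [1, 3, 5].map(i => parseInt(hex.slice(i, i + 2), 16) / 255);
  const [r, g, b] = channels.map(c =>
    c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4)
  );
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(foreground: string, background: string): number {
  const l1 = relativeLuminance(foreground);
  const l2 = relativeLuminance(background);
  const [lighter, darker] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

const ratio = contrastRatio('#767676', '#ffffff'); // roughly 4.54:1
console.log(ratio.toFixed(2), ratio >= 4.5 ? 'passes AA for normal text' : 'fails AA');
```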

Tearyne  56:31  
I'm going to say I think it's a blend of the two. When I started doing the remediations at the financial institution, a lot of what I learned came from the teachings of the late Rachel Olivero, who was in charge of a community, and we named Olivero after her (I suggested that name), because she had helped me so much, even not being there, but being able to consult. So: go do the research that you can on a thing, that's the first layer, and then rely on your communities. There are a ton of accessibility communities built into our own Drupal community, there's an accessibility community on Twitter, and I'm sure there are some in the WordPress arena as well. That's your second line. And once you get to a spot where you're not sure, based on even the knowledge that you have, just like any other JavaScript or Drupal issue we'd have, then you'd want to look at reaching out to someone, either a professional or getting a consultation. That's the method that I would suggest for tackling those types of issues.

Blake  57:49  
And I think there's a lot of room to get better on this particular front. A complaint that we've heard over and over again is exactly what you were talking about: we have to do an extra step before we actually do the remediation. We have to go search for what the guideline means and how it works within the semantics, and we have to search for where the issue lives; the theme or template might be where the issue is, but we don't really know exactly where we need to go look. So one of the things that we've been trying to do with Equalify, and I literally just posted an issue about this a few hours ago, so it's hot in my mind, is making that type of remediation more in line with the reporting. And that requires an extra level of trust from a developer, because you're giving the ability to update, whether it's core code or any kind of database stuff, through a dashboard around the testing. So the person who's doing the test needs to have the level of access that would allow them to update the code. That's really the big holdup: those two access levels are usually different. We have been testing a pretty cool solution with WordPress, because WordPress has a pretty easy history function built in, where we can leverage WordPress core's post history and just publish the updates to a user's post. So if the developer or the website owner wants to roll back an update, they can do that using the WordPress core history component. That's very possible in Drupal too; it was just low-hanging fruit to push out a plugin that did that. Interesting stuff.

Stephen  1:00:07  
So we've been talking about some of the accessibility tools you can use to test your website; we've mentioned WAVE and axe. What are the premium services that you would be using? I say premium in a way to mean the most common, or maybe the best, services you could be using to test your website. I'm also interested to know what are the free services you could use to integrate into your WebOps.

Blake  1:00:41  
So premium as far as cost and premium as far as effectiveness are two different things in the accessibility universe. I didn't come here to throw anybody under the bus, but I will just say that the price tag someone is quoted for an accessibility service doesn't mean they're getting features that correspond to that price tag. There are folks we've worked with before: Little Forest is a good one, Deque is another one, SiteImprove is an accessibility service, Pope Tech, which uses WAVE, is another one; there are tons of them. Equalify, of course, is another platform, but Equalify is taking more of an agnostic approach of integrating a lot of them together. There are a lot out there, and I think the choice really comes down to how you want to use your accessibility service. If you're a developer and you're comfortable running CI/CD kind of stuff, then maybe axe is going to be a good solution for you, because you can get in there and really use it the way you want to, or WAVE's API. Google dev tools, the Google Lighthouse checks, are a really easy, low-hanging-fruit automated service. Again, though, the caution is someone saying their site's accessible because it passes all the tests Google dev tools say it should. So I would highly recommend, if anybody's listening to this and wants to immediately get into accessibility and testing, to read the WCAG guidelines and watch some of the great videos on testing. That's really where you should start, not necessarily the automated tools themselves. I was just going to say, one of the best resources to compare accessibility testing tools, and also to get a good understanding of them, is an accessibility tool audit the UK Government published. It's really, really good, not only because it goes through about 13 different accessibility services and really lists out the features and the comparison, but because all of that comparison is put in the context that the best service can only find around 40% of the issues. So it's a great thing to show clients, because it comes from the UK government, on a UK Government website. I can give you a link to put in the growing show notes. But it's really good, because it does say, here are the options, but the options are only really this good.
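
To make the CI/CD comment concrete, one common pattern (one possible setup, not the only one) is to run axe-core inside a Playwright test so the pipeline fails when automated checks find violations. This assumes the @playwright/test and @axe-core/playwright packages are installed, and the URL is a placeholder; as Blake notes, a clean run here still only covers the portion of issues automated tools can detect.

```typescript
// Minimal sketch: run axe-core against a rendered page from a Playwright test in CI.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('home page has no detectable WCAG A/AA violations', async ({ page }) => {
  await page.goto('https://example.com/');

  // Limit the scan to WCAG 2.0/2.1 A and AA rules.
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
    .analyze();

  // Fail the pipeline when automated checks find violations.
  // A clean run is necessary but not sufficient: manual testing still matters.
  expect(results.violations).toEqual([]);
});
```
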

Stephen  1:03:33  
So I want to ask one more question related to this. When we're talking about WebOps, it implies that the services we're talking about have APIs you can interface with to get feedback from. So when you mention something like SiteImprove, which is a service I'm familiar with for accessibility, do they have an API you can integrate with in a WebOps scenario to get the feedback that you need? Or is it more of a standalone testing tool?

Blake  1:04:09  
Well, SiteImprove in particular, it depends how much you're paying. And I would also push back a little bit on the idea that WebOps is just about APIs, because there are integrations beyond APIs that allow something to work in a WebOps flow. As long as there's an ability to integrate with other services as far as workflow goes, it fits; I'm thinking about some of Pantheon's tools. There are so many great tools that Pantheon has that aren't accessible through Terminus but are still part of the web stack.

Nic  1:04:55  
Yeah. Speaking of Pantheon, love this segue. How does Pantheon support WebOps, and in particular accessibility with WebOps?

Tearyne  1:05:05  
Well, with WebOps, we are supporting not just the DevOps and developer part, but also the performance of your site, which ranges from how fast is this going to how is my content even being served. We do have an accessibility statement on our website, at pantheon.io/accessibility, and it talks about our commitment to ensuring that our ongoing accessibility work adheres to the Level AA criteria of the WCAG 2.1 standards. Some of the ways that is showing up in our work now is thinking about this not just with the developer hat on; I'm having to remind myself to take my developer hat off, right? A good section of the people who access our dashboard are people who are not developers, who are not going to put code in; they're just coming in to click through and get to the multidevs, to see, what does this look like? How has this been solved? Multidev environments have helped with that. Even when I was doing some freelance work on Pantheon, I was able to say, hey, you had this issue with how this goes responsive, or with this color contrast; here's the multidev you can go to to see where I've solved it, and then another issue is solved somewhere else, because we know that fixes on one item can affect another. But yeah, it's interesting,

Nic  1:06:29  
too, because that's a common thing. Developers, especially in the last five years or so, have made a real push, I think, especially in the Drupal community, to make the front end accessible. And in Drupal in particular, you know, the admin interface is fairly accessible as well. But many times we don't care about our tools: we might make the most accessible website, but the tooling around the website isn't accessible. It's good to hear that Pantheon has a statement about that and is making a push to make their tools accessible as well.

Tearyne  1:06:59  
Yeah. Before Pantheon, I worked on numerous projects where I'm like, yes, I made it accessible on the developer end, and when it spits out the thing, it's great. And then the secretary I was designing the tool for, with no visible disabilities, comes in and says, oh, I don't know how this works. I don't know how to put this content in the way it should be so that it can be accessible, so we don't get sued. So even that part of the user experience is important.

Stephen  1:07:25  
Blake, we've talked about the Equalify app a few times here, but we actually haven't talked about it. Like, what is the Equalify app?

Nic  1:07:37  
Yeah, this should have been at the beginning of this show.

Stephen  1:07:43  
I figured we'd save it for the end, like the big reveal, right?

Blake  1:07:48  
Yeah. Well, Equalify is a project and a community to build an open source accessibility platform. We've been talking about all these different accessibility tools, SiteImprove, WAVE, all these different things, and we're trying to bring them together into a single platform approach, where a user can log in to Equalify, or log in via a version of Equalify on their local machine, and instantly integrate and activate all the accessibility tools they know and love. Different accessibility scanners are what we've integrated so far, but we also want to add in Drupal modules and extensions so that Equalify can start working with Drupal. Right now we work with Drupal sitemaps, so that's the first kind of key into Drupal, but we can do things like add governance standards to Equalify in the future, or do remediation directly from Equalify. So we're an open source project, again, trying to create a platform that is easily extendable to any accessibility service, and that becomes the one key area that anyone doing any kind of accessibility remediation or testing is going to use. And we're slowly building in different integrations. Actually, we're very quickly building integrations to different testing services, and also to different web platforms like Drupal, WordPress, and Backdrop, which we were talking about the other day.
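
For a sense of what "working with Drupal sitemaps" can look like in practice, here is a minimal TypeScript sketch (not Equalify's code) that pulls page URLs out of a site's sitemap.xml, such as the one the Simple XML Sitemap module generates, and hands each one to a scan step; the site URL and the runScan function are assumptions for illustration.

```typescript
// Minimal sketch: use a sitemap as the seed list for an accessibility scan.
async function urlsFromSitemap(sitemapUrl: string): Promise<string[]> {
  const xml = await (await fetch(sitemapUrl)).text();
  // Crude <loc> extraction; a real integration would use an XML parser
  // and follow nested sitemap index files.
  return [...xml.matchAll(/<loc>\s*([^<\s]+)\s*<\/loc>/g)].map((m) => m[1]);
}

async function runScan(url: string): Promise<number> {
  // Stand-in for an axe, WAVE, or Lighthouse run; returns a violation count.
  console.log(`scanning ${url}`);
  return 0;
}

async function main(): Promise<void> {
  const urls = await urlsFromSitemap('https://example.com/sitemap.xml');
  for (const url of urls) {
    await runScan(url);
  }
}

main();
```
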

Nic  1:09:30  
I feel like you should be back on the show, maybe in a couple of months, to talk just about Equalify, because from just the little peeks here and there that I've heard you talking about it, I think it definitely deserves its own show. So I'll talk to John about that.

Tearyne  1:09:46  
I met Blake at BADCamp. We were sitting at the Pantheon booth, and I'm like, I love accessibility, and here's why. And he's like, I've made this thing for accessibility. I'm like, I was always wondering why this did not exist. And we're just sitting there losing it over accessibility, so I'm so excited that you were able to come on, and hopefully you'll come back on again.

Blake  1:10:04  
Thanks for inviting me. I love talking accessibility. I love hanging with all y'all. Great.

Nic  1:10:11  
It's been another great show. Well, Tearyne, thank you for joining us for the last four weeks. It's been great having you on as a guest host, and I extend the same invitation to you: if you have other topics, let us know, because we always love to hear from you.

Stephen  1:10:26  
If you have any questions or feedback, you can reach out to Talking Drupal on Twitter with the handle talkingdrupal, or you can email us at show@talkingdrupal.com. You can connect with the hosts and other listeners on Drupal Slack in the Talking Drupal channel.

Nic  1:10:45  
And you can promote your Drupal community event on Talking Drupal. Learn more at talkingdrupal.com/tdpromo.

Stephen  1:10:53  
You can also get the Talking Drupal newsletter for show news, upcoming camps, local meetups, Chad's book corner, and much more. Sign up for the newsletter at talkingdrupal.com/newsletter.

Nic  1:11:08  
And thank you, patrons, for supporting Talking Drupal; your support is greatly appreciated. You can learn more about becoming a patron at talkingdrupal.com by hitting the Become a Patron button on the side. So Tearyne, if our listeners want to get in touch with you, what's the best way to do that?

Tearyne  1:11:22  
Yes, on Drupal Slack and on drupal.org I am ninelivesblackcat, and on Twitter I am @TearyneG

Nic  1:11:32  
And Blake if our listeners want to get in touch with you,

Blake  1:11:35  
I'm on the Drupal Slack as well, and on Twitter at @bbertucc

Nic  1:11:43  
And Stephen, how about you?

Stephen  1:11:46  
The only way you can get in touch with me is to come to New England DrupalCamp November 18 and 19th in Providence, Rhode Island.

Nic  1:11:54  
And listeners can get in touch with me at @nicxvan pretty much everywhere.

Tearyne  1:12:01  
If you've enjoyed listening, we've enjoyed talking.

Nic  1:12:04  
See you guys next week.

Transcribed by https://otter.ai

