Focus Order Fundamentals: Why Landmark Structure Still Matters in 2024

I've been auditing web applications for over a decade, and there's something both encouraging and frustrating about this Acme Corp job application form. The developers clearly understand accessibility fundamentals — every form field is properly labeled, the heading structure flows logically, and interactive elements have accessible names. Yet they've missed two critical pieces that fundamentally impact how screen reader users navigate the page.
This disconnect illustrates a broader pattern I see in development teams: strong tactical implementation paired with gaps in structural understanding. Let's break down what's happening here and why it matters.
The Missing Foundation
The audit reveals two landmark violations that might seem minor but create significant navigation barriers:
- No main landmark — Screen readers can't jump directly to the primary content
- Missing header/banner landmark — Users lose a key navigation reference point
Here's why this matters more than you might think. When a screen reader user lands on this page, they typically orient themselves by exploring the page structure first. They'll often press a key to list all landmarks — think of it as getting a bird's eye view before diving into details.
On this job application, they'll find a nav landmark but no main or banner. It's like walking into a building where someone removed the lobby directory but left the individual office numbers intact.
The User Experience Impact
Research from the Pacific ADA Center consistently shows that navigation efficiency directly correlates with task completion rates for disabled users. When structural landmarks are missing, users spend cognitive energy on navigation that should be focused on the actual task — in this case, completing a job application.
Consider Sarah, a screen reader user applying for this position. Without a main landmark, she can't quickly skip past any header content, navigation, or promotional elements to reach the application form. She has to tab through every focusable element or use other navigation methods that are less efficient.
More concerning: she might assume the page hasn't loaded properly or that there's no main content area. The Implementation Crisis research documents how these structural gaps create compound barriers that discourage disabled users from completing online tasks.
What Developers Should Fix
The good news? These are straightforward fixes that don't require redesigning the interface.
Add a main landmark:
<main>
  <h1>Apply for a Role</h1>
  <!-- rest of application form -->
</main>
Include a header landmark:
<header>
  <!-- site branding, primary navigation -->
</header>
These changes satisfy WCAG 2.1 Success Criterion 2.4.1 (Bypass Blocks) and dramatically improve navigation efficiency.
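Once the landmarks are in place, a quick spot check in the browser console confirms they're actually exposed. This is a minimal sketch that only looks for native HTML5 landmark elements — a page using ARIA attributes like `role="main"` or `role="banner"` instead would need broader selectors:

```javascript
// Spot-check for landmark elements; paste into the browser console.
// Note: this only detects native HTML5 landmarks, not ARIA role
// attributes, so treat it as a quick sanity check rather than an audit.
function missingNativeLandmarks(doc) {
  return ['main', 'header', 'nav'].filter((tag) => !doc.querySelector(tag));
}

// In a browser: missingNativeLandmarks(document)
// On a page like this one (nav present, main and header absent),
// it would return ['main', 'header'].
```

An empty array means the page exposes all three landmarks; anything else is your fix list.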
The Bigger Development Pattern
What's interesting about this audit is how it reflects common development workflows. The team clearly ran through form accessibility checklists — every input has proper labels, buttons have accessible names, heading hierarchy flows correctly. They're following tactical accessibility guidance well.
But they missed the structural layer. This suggests their accessibility testing process focuses on individual components rather than page-level navigation patterns. Our research on organizational accessibility maturity shows this is typical of teams in the "Component Compliance" stage — they understand element-level requirements but haven't integrated page structure thinking into their workflow.
Building Better Testing Workflows
For development teams facing similar gaps, I recommend expanding your accessibility testing to include structural validation:
- Landmark audit first — Before testing individual components, verify the page has proper landmark structure
- Screen reader navigation testing — Have someone navigate using only landmark quick keys (D in NVDA, R in JAWS)
- Automated structural checks — Tools like axe-core can catch missing landmarks during development
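In practice you'd lean on axe-core's landmark rules for the automated step, but the underlying check is simple enough to sketch. The function below is a simplified, regex-based illustration of a structural check you might run in CI against rendered HTML; it is not a substitute for a real rules engine:

```javascript
// Simplified structural check: flags missing landmarks in an HTML string.
// A real audit should use axe-core's landmark rules; this regex-based
// sketch only illustrates the idea and will miss edge cases.
const LANDMARKS = [
  { name: 'main',   pattern: /<main[\s>]|role=["']main["']/i },
  { name: 'banner', pattern: /<header[\s>]|role=["']banner["']/i },
  { name: 'nav',    pattern: /<nav[\s>]|role=["']navigation["']/i },
];

function missingLandmarks(html) {
  return LANDMARKS
    .filter(({ pattern }) => !pattern.test(html))
    .map(({ name }) => name);
}
```

Run against markup like the audited form's — which exposes only a `<nav>` — this reports `main` and `banner` as missing, exactly the two violations from the audit. Wiring a check like this (or, better, axe-core itself) into CI catches the problem before it ships.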
The False Promise of Automated Accessibility Testing research emphasizes that while automated tools catch these structural issues well, they need to be integrated early in the development process, not just at final testing.
The Operational Reality
From a capacity building perspective, this case represents a common organizational challenge. The team has accessibility knowledge and is applying it consistently at the component level. They're not starting from zero.
The gap is in systematic page structure thinking. This typically gets resolved through:
- Template-level fixes — Adding landmark structure to base templates ensures consistency across pages
- Design system updates — Incorporating landmark patterns into component libraries
- Testing process refinement — Adding structural validation to existing accessibility testing workflows
These are operational improvements that build on existing knowledge rather than requiring fundamental skill development.
Moving Forward
This Acme Corp job application demonstrates something important: accessibility implementation is often about connecting existing knowledge rather than learning entirely new concepts. The developers understand form accessibility, heading structure, and interactive element requirements. They just need to extend that thinking to page-level navigation patterns.
For organizations seeing similar patterns in their audits, focus on building systematic accessibility processes that address both component-level and structural requirements. The tactical knowledge is there — it's the integration that needs attention.
The encouraging part? Once teams understand landmark structure, they tend to implement it consistently. It becomes part of their mental model for page construction, not an afterthought during testing.
About Marcus
Seattle-area accessibility consultant specializing in digital accessibility and web development. Former software engineer turned advocate for inclusive tech.
Specialization: Digital accessibility, WCAG, web development
Transparency Disclosure
This article was created using AI-assisted analysis with human editorial oversight. We believe in radical transparency about our use of artificial intelligence.