Bot gov

Optimizing digital government for AI and other machines.

By: Luke Fretwell

Posted: November 30, 2025

Estimated read time: 4 minutes

Just as governments design websites for the humans who visit them, they must now prioritize machines in their digital strategies as internet traffic shifts toward bots.

Bots 101

Bots – also known as crawlers – are computer programs that visit websites to pull or evaluate information.

Search engines like Google, Bing, and DuckDuckGo use bots to crawl websites and determine how they show up in search results. As search engine technology became ubiquitous, the need to make websites more machine-friendly became important, and the practice of search engine optimization (SEO) emerged.

Artificial intelligence tools like ChatGPT and Claude use bots in the same way. However, rather than displaying simple search results, they use information that’s been crawled to deliver in-app responses to chat prompts.

Researchers, journalists and technologists also use bots to scrape public information to make it more consumable, particularly when data isn’t easily accessible – improperly structured, poorly formatted or not easily retrievable.

Machine experience

The term “machine-readable” generally refers to a digital product’s technical accessibility and properly structured information. A site must first allow bot access and, once accessible, provide content in a structured way.
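For example, structured markup such as schema.org JSON-LD embedded in a page gives crawlers a predictable way to read basic facts about an agency. A minimal sketch (the organization name, URL and email below are illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "GovernmentOrganization",
  "name": "City of Example",
  "url": "https://www.example.gov",
  "contactPoint": {
    "@type": "ContactPoint",
    "contactType": "customer service",
    "email": "info@example.gov"
  }
}
```

A human sees a styled contact page; a bot sees the same facts as labeled fields it can parse without guessing.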

Much of today’s digital experience focuses on how humans interact with a website or application. This includes human-centered design practices that focus on user experience and graphical interfaces.

With the emergence of artificial intelligence tools like ChatGPT and Claude, machine experience matters even more. Because of this, governments must become more proactive in adopting practices and policies that support bot accessibility.

Why it matters

Understanding bots protects website infrastructure from abuse and malicious activity, balances transparency requirements with security needs, and ensures public information and services are still optimized for access.

The challenge

As bot traffic increases, new burdens are placed on government technology resources, particularly server overloads that can cause websites to be less performant or secure. This also adds to financial costs that, if not properly managed, could be substantial.
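Rate limiting is one common way to keep bot bursts from overloading servers. A minimal token-bucket sketch in Python, assuming illustrative limits (2 requests per second, burst of 5) – real deployments would typically do this at the web server or CDN layer rather than in application code:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: sustained `rate` requests/sec, bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A bot sending 10 back-to-back requests: the burst is served, the rest are throttled
bucket = TokenBucket(rate=2, capacity=5)
results = [bucket.allow() for _ in range(10)]
```

Requests beyond the burst are rejected (or queued) instead of hitting the origin, which keeps one aggressive crawler from degrading the site for everyone else.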

The solution

To better manage bot access, governments must implement bot policies that are transparent, practical and aligned with principles of openness and good public service.

Bot traffic

Website traffic from bots is steadily rising. According to Cloudflare, bot traffic makes up nearly 30% of all web traffic. By location, nearly 45% of that comes from the United States.

Bot access

As a general rule, governments should start conservatively – blocking clearly malicious bots immediately – and refine over time:

  • Allow most crawlers, except known malicious ones
  • Rate-limit heavily instead of blocking outright
  • Focus on preventing abuse, not on blocking all bots
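These rules can be sketched in a robots.txt file. GPTBot and ClaudeBot are the crawler user-agent tokens that OpenAI and Anthropic publish; the blocked bot name below is illustrative, and Crawl-delay is a non-standard directive that only some crawlers honor:

```
# Allow well-documented AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Block a known abusive scraper (name illustrative)
User-agent: BadScraperBot
Disallow: /

# Everyone else: allowed, but keep non-public paths off-limits
User-agent: *
Disallow: /admin/
Crawl-delay: 10
```

Note that robots.txt is advisory – well-behaved crawlers respect it, but enforcement against bad actors still requires server-side controls like the rate limiting described above.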

Bot documentation

Legitimate AI tools publish technical documentation about their crawler behavior, such as OpenAI’s GPTBot and Anthropic’s ClaudeBot. And Cloudflare outlines its access policies for sites using its services.

Similar to how these companies expose their crawl behavior, governments can do the same with policy, creating bot governance that sets the terms for access.

Public (bot) policy

As with any policy, governments should openly publish their bot governance. This makes it more transparent and helps the public understand what access looks like for machines and, ultimately, for the people using those services.

At minimum, bot policies should include:

  • Whitelist/blacklist
  • Rate limits
  • Verification requirements
  • Review processes
  • License/terms of service
  • Contact information
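Published governance can itself be machine-readable. A hypothetical policy manifest covering the elements above – the field names and values are invented for illustration, not an established standard:

```yaml
# Illustrative bot-policy manifest (not a standard format)
version: 1
contact: webmaster@example.gov
license: https://creativecommons.org/publicdomain/zero/1.0/
whitelist:
  - GPTBot
  - ClaudeBot
blacklist:
  - BadScraperBot
rate_limits:
  default: 60      # requests per minute per client
  verified: 600    # higher limit for verified crawlers
verification:
  method: reverse-dns   # e.g., confirm crawler IPs resolve to the operator's domain
review:
  cadence: quarterly
```

A file like this, published alongside robots.txt, would let bot operators and the public discover the terms of access programmatically.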

Bot standards

Government relies heavily on the Web Content Accessibility Guidelines (WCAG) for information access standards related to people and assistive technologies (like screen readers).

We now need Web Bot Accessibility Guidelines, a WCAG equivalent for machine access. Creating governance that includes accessibility standards for bots will ultimately make public services better for everyone.

Build your bot policy

I’ve started work on policy and implementation checklist templates (with Claude’s help) to help governments better implement and share their bot access governance. These templates are available in the GovFresh bot governance GitHub repository. Feel free to submit issues or pull requests.

My goal with the templates is to set higher expectations for vendors who support the public sector, and also to help governments be more thoughtful and transparent in how anyone can access information and digital services, whether they’re human or machine.

Photo of Luke Fretwell

Luke Fretwell

Luke Fretwell is the founder and maintainer of GovFresh and co-founder of ScanGov.

More about Luke