ManyPets, a London-based pet insurer, now routes every single claim through an AI agent called Millie. 55% of those claims reach resolution without a human touching them. Not reviewing them after the fact. Not signing off on a recommendation. Zero human involvement, from submission to payout.
That statistic alone would be worth watching. But it landed in the same year that the Royal College of Veterinary Surgeons ruled that UK vets no longer need to physically examine an animal before prescribing medication, that a vision transformer hit 92% accuracy identifying cat breeds from photos, and that multiple research labs demonstrated quadruped robots capable of guiding humans through cluttered environments on a leash. Each development is interesting in isolation. Together, they sketch something that nobody in UK pet care seems to be planning for: a fully autonomous pipeline from symptom detection to diagnosis to treatment, with minimal veterinary oversight.
The Eyes
Computer vision for animal breed identification sounds like a novelty app feature. It isn't. Breed identification is the first step in a diagnostic chain. Different breeds carry different genetic predispositions. A Cavalier King Charles Spaniel has a near-certain lifetime probability of developing mitral valve disease. A Persian cat's brachycephalic skull creates chronic respiratory risk. Getting the breed wrong means getting the risk profile wrong, which means missing the condition that kills the animal.
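The diagnostic chain described above (breed in, risk profile out) can be sketched as a simple lookup. The breed names are real, but the prevalence figures below are illustrative placeholders, not clinical data:

```python
# Hypothetical breed -> hereditary-risk table. Prevalence values are
# invented placeholders for the sketch, not veterinary statistics.
BREED_RISKS = {
    "cavalier_king_charles_spaniel": [("mitral valve disease", 0.90)],
    "persian": [("brachycephalic airway syndrome", 0.40)],
    "labrador_retriever": [("hip dysplasia", 0.12)],
}

def flag_risks(breed: str, threshold: float = 0.25) -> list[str]:
    """Return conditions whose assumed lifetime prevalence exceeds threshold."""
    return [cond for cond, p in BREED_RISKS.get(breed, []) if p >= threshold]
```

The point the paragraph makes falls straight out of the structure: misclassify a Cavalier as a Labrador and `flag_risks` returns nothing, so the condition most likely to kill the animal never gets surfaced.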
The technical performance has quietly gotten serious. A February 2026 paper using the Global Context Vision Transformer architecture, GCViT-Tiny, reported 92% test accuracy and 94.54% validation accuracy on fine-grained cat breed recognition. That's across dozens of visually similar breeds, using a model small enough to run on a phone. The Oxford-IIIT Pet Dataset and Stanford Dogs Dataset have become standard benchmarks, and the latest models trained on them are pushing past the accuracy threshold where veterinary professionals start paying attention.
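The paper describes the model, not its deployment logic, so the following is an assumption about how any 92%-accurate classifier would need to be wired into a clinical pipeline: gate predictions on the model's own confidence and defer uncertain cases. A minimal sketch, with a made-up label set and threshold:

```python
import math

def softmax(logits: list[float]) -> list[float]:
    """Convert raw model scores to a probability distribution."""
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify_or_defer(logits: list[float], labels: list[str],
                      min_conf: float = 0.80) -> tuple[str, float]:
    """Return (breed, confidence), or defer to a human below the threshold.
    The 0.80 cutoff is an illustrative choice, not from the paper."""
    probs = softmax(logits)
    conf = max(probs)
    label = labels[probs.index(conf)]
    return (label, conf) if conf >= min_conf else ("defer_to_human", conf)
```

A sharply peaked score vector yields a breed; a flat one, where visually similar breeds split the probability mass, routes the image to a person. That deferral rule, not the headline accuracy, is what determines whether a misread breed ever reaches the risk-profiling step.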
PitPat, a UK company, already ships a GPS and activity tracker that clips to a dog's collar and monitors exercise patterns, sleep, and calorie burn. Felcana, a London startup founded by two Royal Veterinary College graduates, built a wearable health monitor that tracks activity 24/7. They raised £100,000 from Innovate UK and secured seed funding from INSEAD Alum Ventures. These devices don't just count steps. They're building longitudinal health datasets, one collar at a time, that will eventually feed the diagnostic models that decide whether your dog needs a vet visit.
The Brain
The insurance layer is where AI agent autonomy gets real, because money forces accountability that research papers don't.

ManyPets (formerly Bought By Many) processes claims through Millie with a speed that makes traditional pet insurance look archaic. Across the Atlantic, Trupanion's ML system, trained on millions of historical claims, processes invoices at veterinary hospitals in five seconds. Over 60% of their direct payments complete within 60 seconds. Five Sigma's Clive AI platform claims a 90% reduction in handling time for pet insurance claims through straight-through processing.
These aren't chatbots answering FAQ questions. They're autonomous agents making financial decisions about animal healthcare. When Millie reviews a claim, she's evaluating the diagnosis, cross-referencing the treatment against policy terms, checking for pre-existing conditions, and authorising payment. The 55% automation rate means the system is confident enough in its own judgement to skip human review for more than half of all cases.
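ManyPets hasn't published Millie's decision logic, so the following is a guess at the shape of any straight-through processing system: a claim is auto-paid only when every check passes and the agent's confidence clears a bar, declined on a hard exclusion, and escalated otherwise. All condition names, limits, and thresholds are invented for the sketch:

```python
from dataclasses import dataclass

@dataclass
class Claim:
    condition: str
    amount: float
    model_confidence: float  # agent's confidence in its own assessment

# Illustrative policy terms; every value here is a placeholder.
COVERED = {"gastroenteritis", "otitis externa", "cruciate ligament rupture"}
PRE_EXISTING = {"mitral valve disease"}
AUTO_APPROVE_CONFIDENCE = 0.90
AUTO_APPROVE_LIMIT = 2000.0  # GBP

def route_claim(claim: Claim) -> str:
    """Straight-through processing: pay, decline, or escalate to a human."""
    if claim.condition in PRE_EXISTING:
        return "decline"
    if (claim.condition in COVERED
            and claim.amount <= AUTO_APPROVE_LIMIT
            and claim.model_confidence >= AUTO_APPROVE_CONFIDENCE):
        return "auto_pay"
    return "human_review"
```

A 55% automation rate is, in this framing, just the fraction of claims that land in the `auto_pay` branch. Loosen the confidence threshold or raise the limit and that fraction climbs, which is exactly the ratchet the article describes: every error-free automated payout is an argument for widening the branch.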
The UK pet insurance market is still small compared to human health insurance. But 11.1 million dogs and 10.5 million cats live in UK households, according to the 2025 PDSA PAW Report. That's a population large enough to generate the claims volume these models need to improve. And improve they will, because every claim Millie processes without error makes the next automation decision easier to justify.
The Legs
Nobody has built an autonomous dog walker. That needs to be said clearly, because the gap between current robotics research and "a robot walks your Labrador around the park" is enormous.
What does exist: quadruped robots that can guide humans through physical spaces using leash-based force feedback. A team at UC Berkeley demonstrated a leash-guided robot dog that physically leads a person through narrow, cluttered environments. The system communicates direction changes through tension in a rigid leash, the same way a real guide dog does. At CHI 2024, researchers presented RDog, a quadruped guiding system for visually impaired people that combines SLAM-based mapping with voice feedback and kinesthetic cues. Users navigated faster, with fewer collisions and lower cognitive load than with traditional aids.
Binghamton University researchers trained their robot to navigate indoor environments and detect leash tugs using reinforcement learning, achieving functional performance in about 10 hours of training. The AI robot dog market, broadly defined, hit $1.31 billion in 2024 and is projected to reach $2.46 billion by 2033.
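The papers above don't publish their controllers here, but the core idea they share, mapping leash tension to a navigation cue, is simple enough to sketch. Everything below (the frame convention, the deadband, the angle bins) is an assumed toy model, not any team's actual system:

```python
import math

def leash_cue(force_x: float, force_y: float, deadband: float = 2.0) -> str:
    """Map a 2-D leash tension vector (newtons, robot frame: +x forward,
    +y left) to a coarse guidance cue. Tension below the deadband is
    treated as slack and ignored."""
    magnitude = math.hypot(force_x, force_y)
    if magnitude < deadband:
        return "hold"
    angle = math.degrees(math.atan2(force_y, force_x))
    if -30.0 <= angle <= 30.0:
        return "forward"
    return "turn_left" if angle > 30.0 else "turn_right"
```

The asymmetry the article points at lives outside this function: a cooperative human produces tension vectors that change smoothly, while a dog lunging at a squirrel produces a spike that saturates any deadband and arrives faster than the control loop can replan.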

These projects all focus on guiding humans, not walking pets. The engineering challenge of controlling a live animal on a leash is categorically different from guiding a cooperative human. A dog that spots a squirrel generates forces and behaviours that no current quadruped robot can handle safely. But the navigation stack, the obstacle avoidance, the terrain adaptation: that's transferable. The question isn't whether the hardware can walk a predetermined route through a park. It can. The question is whether it can manage an unpredictable biological agent at the other end of the leash.
The Regulator Blinked
In what the American Animal Hospital Association called a "watershed decision," RCVS Council voted 20-3 to allow veterinary surgeons to prescribe medication based on remote assessment alone. No physical examination required. The legal interpretation: "clinical assessment" under UK Veterinary Medicines Regulations includes video consultations.
The profession pushed back hard. In a 2021 RCVS survey, 73% of respondents agreed that a recent physical exam is essential for an animal to be under veterinary care, and 80% disagreed that regulations should allow remote video prescribing for animals never physically examined. The British Veterinary Association publicly expressed its disappointment.
But the ruling stands. And it creates a regulatory gap that AI systems will fill whether anyone intends them to or not. If a vet can prescribe based on a video call, and a computer vision system can identify the breed and flag symptoms from the same video feed, the vet's role shrinks to a rubber stamp. Combine remote prescribing with AI-powered symptom detection, automated insurance claims, and breed-specific risk profiling, and you've got an end-to-end pipeline where the animal never sees a qualified professional in person.
The RCVS carved out one exception: antimicrobial prescribing for individual animals still requires a physical exam "in all but exceptional circumstances." That's a sensible firewall. But it only covers one class of medication, and it doesn't address the diagnostic side at all.
What's Actually at Stake
The UK has 11.1 million dogs and a veterinary workforce that was already stretched thin before the post-pandemic pet ownership boom. The average UK household spends more on pet food than on tea and coffee combined, in a market worth over £4 billion annually. The demand for accessible, affordable pet healthcare is real and growing.

AI can help. A wearable that catches a heart murmur early saves the dog's life and the owner's savings. An insurance agent that pays claims in seconds instead of weeks reduces genuine suffering. A breed identification system that flags genetic risk factors before symptoms appear is straightforwardly good.
But the pieces are assembling faster than the oversight. Nobody at RCVS is modelling what happens when Felcana's collar data feeds into a diagnostic model that triggers a remote consultation where a vet rubber-stamps an AI-generated treatment plan that ManyPets' Millie pre-authorises before the appointment ends. Each component works. The integration hasn't been tested, regulated, or even seriously discussed.
The UK pet care system is becoming an accidental testbed for autonomous AI agents operating on biological subjects with no formal framework for how the pieces connect. The animals can't file complaints.
Sources
Research Papers:
- Fine-Grained Cat Breed Recognition with Global Context Vision Transformer — GCViT-Tiny (2026)
- Robotic Guide Dog: Leading a Human with Leash-Guided Hybrid Physical Interaction — Li et al., UC Berkeley (2021)
- Navigating Real-World Challenges: A Quadruped Robot Guiding System for Visually Impaired People — CHI 2024
- Understanding Expectations for a Robotic Guide Dog for Visually Impaired People — 2025

Industry / Data:
- PDSA PAW Report 2025 — UK pet population statistics
- ManyPets Sends All Claims Through AI Agent — Insurance Post
- Trupanion AI Technology Empowers Pet Parents — Trupanion (2023)
- Five Sigma Clive AI Pet Insurance Claims — Five Sigma
- Felcana: Smart Device for Pets Improving Veterinary Care — Innovate UK KTN
- AI Robot Dog Market Size 2025 — Business Research Insights

Regulatory:
- RCVS Under Care Guidance — Royal College of Veterinary Surgeons
- What the UK's New Telemedicine Rules Mean — AAHA
- In Watershed Decision, UK Relaxes Telemedicine Rules — VIN News
Related Swarm Signal Coverage: