Exploring Holo1: The Future of Action Vision Language Models
At H Company, we are excited to introduce Holo1, a new family of Action Vision Language Models (VLMs). Alongside Holo1, we are also releasing WebClick, a multimodal localization benchmark, on the Hugging Face Hub. Together, these releases mark a major step forward in web automation and AI's ability to interact with user interfaces.
What is Holo1?
Holo1 is the first family of open-source Action VLMs designed specifically for deep web-UI understanding and accurate element localization. The family currently comprises two models, Holo1-3B and Holo1-7B. Holo1-7B reaches 76.2% average accuracy on common UI localization benchmarks, one of the highest scores among small models, making it a powerful tool for developers and businesses alike.
Key Features of Holo1
- Open Source: Holo1's weights are openly available on Hugging Face, leaving you free to build on them.
- Performance Excellence: Achieve top-tier accuracy for UI tasks, streamlining the process of web interactivity.
- Versatile Applications: From user interface testing to automated web browsing, Holo1 provides broad applications for various business needs.
Using Holo1 with Transformers
The Holo1 models are built on the Qwen2.5-VL architecture and integrate seamlessly with Transformers. Below, we walk through loading the model and preparing an image for inference.
Initial Setup in Python
You can easily load and prepare your Holo1 model using the following Python code snippet:
```python
from transformers import AutoModelForImageTextToText, AutoProcessor
import torch

model = AutoModelForImageTextToText.from_pretrained(
    "Hcompany/Holo1-3B",
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
    device_map="auto",
)
processor = AutoProcessor.from_pretrained("Hcompany/Holo1-3B")
```
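Note that the `flash_attention_2` backend requires a CUDA GPU and the separate `flash-attn` package. As a convenience not shown in the official snippet, you can fall back to PyTorch's built-in `sdpa` attention when `flash-attn` is not installed:

```python
import importlib.util

# Pick flash_attention_2 only when the flash-attn package is importable;
# otherwise fall back to PyTorch's scaled-dot-product attention ("sdpa").
attn_implementation = (
    "flash_attention_2"
    if importlib.util.find_spec("flash_attn") is not None
    else "sdpa"
)
```

Pass the resulting `attn_implementation` value to `from_pretrained` in place of the hard-coded string.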
Image Preprocessing
Once you have your model ready, load your image and set the necessary instructions for interaction:
```python
image_url = "https://huggingface.co/Hcompany/Holo1-3B/resolve/main/calendar_example.jpg"
guidelines = "Localize an element on the GUI image according to my instructions and output a click position as Click(x, y)."
instruction = "Select July 14th as the check-out date"

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": image_url},
            {"type": "text", "text": f"{guidelines}\n{instruction}"},
        ],
    }
]

inputs = processor.apply_chat_template(
    messages,
    tokenize=True,
    add_generation_prompt=True,
    return_tensors="pt",
    return_dict=True,
).to(model.device)
```
Making Predictions
After processing the image and loading your instructions, you are ready for inference:
```python
generated_ids = model.generate(**inputs, max_new_tokens=128)
decoded = processor.batch_decode(
    generated_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False
)
```
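The decoded response should follow the `Click(x, y)` format requested in the guidelines. A small helper, our own illustration rather than part of the Holo1 release, can extract the coordinates; keep in mind they refer to the processed image's pixel grid, so they may need rescaling if the processor resized your input:

```python
import re

def parse_click(text):
    """Return (x, y) from a 'Click(x, y)' string, or None if no match."""
    match = re.search(r"Click\((\d+),\s*(\d+)\)", text)
    if match is None:
        return None
    return int(match.group(1)), int(match.group(2))

# Example: parse_click("Click(352, 348)") returns (352, 348)
```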
Meet Surfer-H: A Companion to Holo1
Also part of this release is Surfer-H, a web-native AI agent that interacts with the browser the way a human user would. Surfer-H is powered by the Holo1 models and provides a robust solution for web automation.
Key Advantages of Surfer-H
- High Efficiency: Achieve an outstanding 92.2% accuracy in real-world web tasks at a low cost of $0.13 per task.
- Modular Architecture: Surfer-H consists of three independent components: a Policy model, a Localizer model, and a Validator model. This modularity empowers comprehensive task automation from reading to validating outcomes.
- User-Like Operation: Unlike other agents that depend on complex APIs or wrappers, Surfer-H operates solely through the browser, mimicking genuine user behavior effectively.
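To make the modular design concrete, here is a minimal sketch of how the three components could be wired together in an agent loop. All class and function names below are hypothetical illustrations, not the actual Surfer-H API:

```python
from dataclasses import dataclass

@dataclass
class Action:
    kind: str          # e.g. "click" or "done"
    target: str = ""   # natural-language description of the UI element

def run_agent(task, policy, localizer, validator, browser, max_steps=10):
    """Drive the browser until the policy declares the task done.

    policy:    decides the next action from the task and a screenshot
    localizer: grounds a target description to (x, y) pixel coordinates
    validator: judges whether the final screenshot satisfies the task
    """
    for _ in range(max_steps):
        screenshot = browser.screenshot()
        action = policy(task, screenshot)
        if action.kind == "done":
            return validator(task, screenshot)
        x, y = localizer(screenshot, action.target)
        browser.click(x, y)
    return False  # step budget exhausted
```

In Surfer-H, the Localizer role is exactly where a Holo1 model slots in: it turns the policy's "click the check-out date" intent into concrete screen coordinates.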
These innovations collectively establish a new frontier in web automation, highlighting impressive localization capabilities and cost-effective navigation solutions.
Join the Holo1 Revolution
As we unveil Holo1 and Surfer-H, we invite developers and tech enthusiasts to explore these groundbreaking solutions. Connect with us in the discussion tab of this blog post and the model repository. We are eager to witness the innovative applications and solutions you will create using this transformative technology.
Citation
For further reference, please see our technical report on Surfer-H Meets Holo1, which details our journey and findings in developing cost-efficient web agents.
```plaintext
@misc{andreux2025surferhmeetsholo1costefficient,
      title={Surfer-H Meets Holo1: Cost-Efficient Web Agent Powered by Open Weights},
      author={Mathieu Andreux and others},
      year={2025},
      eprint={2506.02865},
      archivePrefix={arXiv},
      primaryClass={cs.AI},
      url={https://arxiv.org/abs/2506.02865},
}
```
With these tools, the potential for web automation is limitless. Embrace the future with Holo1!