Service Engineer / Technician (m/f/d)
- 04 February 2026
- 100%
- Permanent position
- Basel
Job summary
Anubis protects websites from aggressive AI scrapers, which can otherwise cause downtime and make site resources inaccessible.
Tasks
- Implement a Proof-of-Work scheme to deter scrapers effectively.
- Optimize server resources to reduce downtime for users.
- Enhance fingerprinting methods to distinguish between bots and legitimate users.
Skills
- Familiarity with JavaScript and web security measures required.
- Strong understanding of Proof-of-Work mechanisms essential.
- Experience in website protection and resource management.
About the job
You are seeing this because the administrator of this website has set up Anubis to protect the server against the scourge of AI companies aggressively scraping websites. This can and does cause downtime for the websites, which makes their resources inaccessible for everyone.
Anubis is a compromise. It uses a Proof-of-Work scheme in the vein of Hashcash, a proof-of-work system originally proposed to reduce email spam. The idea is that the additional load is negligible for an individual visitor, but at mass-scraper volumes it adds up and makes scraping much more expensive.
Ultimately, this is a placeholder solution so that more time can be spent on fingerprinting and identifying headless browsers (e.g., via how they render fonts), so that the proof-of-work challenge page doesn't need to be shown to users who are much more likely to be legitimate.
Note that Anubis requires modern JavaScript features that plugins like JShelter will disable. Please disable JShelter or similar plugins for this domain.