Service Engineer / Technician (m/f/d)
- 04 February 2026
- 100%
- Permanent position
- Basel
Job summary
Anubis protects websites from aggressive AI scraping bots while keeping them accessible to legitimate users.
Tasks
- Implement Proof-of-Work to deter mass web scrapers.
- Enhance server security and minimize downtime for users.
- Develop methods to identify headless browsers accurately.
Skills
- Experience with JavaScript and web security protocols required.
- Strong understanding of anti-scraping technologies.
- Ability to troubleshoot and optimize server performance.
About the job
You are seeing this because the administrator of this website has set up Anubis to protect the server against the scourge of AI companies aggressively scraping websites. This can and does cause downtime for the websites, which makes their resources inaccessible for everyone.
Anubis is a compromise: it uses a Proof-of-Work scheme in the vein of Hashcash, the proof-of-work scheme originally proposed to reduce email spam. The idea is that at an individual scale the additional load is negligible, but at mass-scraper scale it adds up and makes scraping much more expensive.
Ultimately, this is a placeholder solution so that more time can be spent on fingerprinting and identifying headless browsers (e.g. via how they do font rendering), so that the proof-of-work challenge page doesn't need to be presented to users who are much more likely to be legitimate.
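As a rough illustration of the kind of client-side signals such fingerprinting might start from, the sketch below checks a navigator-like object for common headless tells. These particular checks are assumptions for illustration only, not Anubis's heuristics; real fingerprinting (e.g. via font-rendering differences) is far subtler, and each signal here can be spoofed:

```javascript
// Collect simple hints that a navigator-like object belongs to a
// headless browser. Takes the object as a parameter so it can be
// exercised outside a real browser.
function headlessSignals(nav) {
  const signals = [];
  // Standard flag set by WebDriver-controlled browsers.
  if (nav.webdriver) signals.push('webdriver flag set');
  // Headless environments often expose an empty plugin list.
  if (!nav.plugins || nav.plugins.length === 0) signals.push('no plugins');
  // Older headless Chrome announced itself in the user agent.
  if (/HeadlessChrome/.test(nav.userAgent || '')) signals.push('headless user agent');
  return signals;
}
```

In a browser this would be called as `headlessSignals(navigator)`; a non-empty result suggests, but does not prove, automation.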
Please note that Anubis requires the use of modern JavaScript features that plugins like JShelter will disable. Please disable JShelter or other such plugins for this domain.