Ahrefs is seeking an experienced Network Engineer to join our team and help manage our large-scale distributed infrastructure. In this role, you will be responsible for the design, implementation, and maintenance of our network infrastructure, ensuring high availability, performance, observability, and security.
A large part of our system is custom OCaml code, complemented by third-party technologies: Debian, ELK, Puppet, Ansible, and anything else that solves the task at hand. In this role, be prepared to deal with a 100+ petabyte storage cluster, 3,000+ bare-metal servers, experimental large-scale deployments, and all kinds of software bugs and hardware deviations on a daily basis.
Responsibilities
- Design, implement, document, and maintain a robust and scalable network infrastructure to support Ahrefs' distributed crawler and web services
- Configure and manage network devices such as routers, switches, and load balancers, as well as Linux servers
- Improve and monitor network observability and performance; troubleshoot issues related to network connectivity, latency, and throughput across network devices and servers
- Implement network security measures, including firewalls, VPNs, and access control lists
- Collaborate closely with the infrastructure team to ensure seamless integration between network and server infrastructure
- Participate in the on-call rotation to provide 24/7 support and incident response
- Automate network management tasks and develop scripts to improve efficiency (sketched briefly below)
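To give a flavour of the scripting involved, here is a minimal Python sketch of one such routine task: checking TCP reachability and connect latency across a set of hosts in parallel. The host names and ports are placeholders, not actual Ahrefs infrastructure.

```python
#!/usr/bin/env python3
"""Minimal sketch of a routine check worth automating: measure TCP connect
latency to a set of hosts in parallel. Hosts and ports are placeholders."""

import socket
import time
from concurrent.futures import ThreadPoolExecutor

TARGETS = [
    ("bgp-rr1.example.net", 179),     # hypothetical route reflector
    ("lb1.example.net", 443),         # hypothetical load balancer
    ("storage-01.example.net", 22),   # hypothetical storage node
]
TIMEOUT_S = 2.0

def probe(target):
    """Open a TCP connection and report latency in milliseconds, or the error."""
    host, port = target
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=TIMEOUT_S):
            return host, port, (time.monotonic() - start) * 1000.0, "ok"
    except OSError as exc:
        return host, port, None, f"failed: {exc}"

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=len(TARGETS)) as pool:
        for host, port, latency_ms, status in pool.map(probe, TARGETS):
            detail = f"{latency_ms:.1f} ms" if latency_ms is not None else status
            print(f"{host}:{port} -> {detail}")
```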
Requirements
- Deep understanding of network protocols at different layers (TCP/IP, BGP, OSPF, etc.) and of network architectures
- Extensive experience with network devices and network operating systems from major vendors (e.g., Cisco, Juniper, NVIDIA)
- Strong knowledge of network security principles and best practices
- Proficiency in network automation tools (e.g., Ansible, Puppet, Python, Bash scripting)
- Experience with virtualization technologies such as VLANs, VXLANs, and GRE tunnels (a small illustration follows this list)
- Familiarity with cloud networking concepts and services (e.g., AWS VPC, Azure Virtual Network)
- Strong troubleshooting and problem-solving skills
- Excellent communication and documentation skills
- Bachelor's degree in Computer Science, Information Technology, or a related field
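As an illustration of the kind of server-side automation this covers, the sketch below audits the VLAN subinterfaces configured on a Linux host by parsing iproute2's JSON output. It assumes a reasonably modern iproute2 (`ip -j`), and the expected VLAN IDs are made up for the example.

```python
#!/usr/bin/env python3
"""Sketch of a Linux-side VLAN audit, assuming a modern iproute2 that supports
JSON output (`ip -j`). The expected VLAN set below is a made-up example."""

import json
import subprocess

EXPECTED_VLAN_IDS = {10, 20, 30}  # hypothetical VLANs this host should carry

def configured_vlan_ids():
    """Return the set of VLAN IDs currently configured on this host."""
    out = subprocess.run(
        ["ip", "-j", "link", "show", "type", "vlan"],
        check=True, capture_output=True, text=True,
    ).stdout
    links = json.loads(out) if out.strip() else []
    # Recent iproute2 versions report VLAN details under linkinfo/info_data.
    return {
        link["linkinfo"]["info_data"]["id"]
        for link in links
        if link.get("linkinfo", {}).get("info_data", {}).get("id") is not None
    }

if __name__ == "__main__":
    present = configured_vlan_ids()
    missing = EXPECTED_VLAN_IDS - present
    unexpected = present - EXPECTED_VLAN_IDS
    print(f"present: {sorted(present)}")
    if missing or unexpected:
        print(f"missing: {sorted(missing)}, unexpected: {sorted(unexpected)}")
```

In practice a check like this would be wrapped into Ansible or a monitoring job rather than run by hand; the sketch only shows the core idea.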
Preferred Qualifications
- Experience working with bare-metal servers and large-scale distributed systems
- Knowledge of Linux operating system internals and kernel-level networking
- Familiarity with monitoring and logging tools such as the ELK stack, Prometheus, and Grafana (see the sketch after this list)
- Experience with OCaml or other functional programming languages is a bonus
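For a sense of how monitoring ties in, the sketch below exposes a custom network metric for Prometheus to scrape. It assumes the `prometheus_client` Python package is available; the target hosts and port are again placeholders.

```python
#!/usr/bin/env python3
"""Sketch of exposing a custom network metric to Prometheus, assuming the
`prometheus_client` package is installed. Targets and port are placeholders."""

import socket
import time

from prometheus_client import Gauge, start_http_server

TARGETS = {"lb1.example.net": 443, "storage-01.example.net": 22}  # hypothetical hosts
CONNECT_LATENCY = Gauge(
    "tcp_connect_latency_seconds", "TCP connect latency per target", ["target"]
)

def measure(host, port, timeout=2.0):
    """Return TCP connect latency in seconds, or NaN if the target is unreachable."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return float("nan")

if __name__ == "__main__":
    start_http_server(8000)  # metrics served on :8000/metrics for Prometheus to scrape
    while True:
        for host, port in TARGETS.items():
            CONNECT_LATENCY.labels(target=host).set(measure(host, port))
        time.sleep(15)
```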
Who We Are
Ahrefs runs an internet-scale bot that crawls the whole web 24/7, storing huge volumes of information to be indexed and structured in a timely fashion. Our backend system is powered by a custom petabyte-scale distributed key-value store built to accommodate all that data coming in at high speed. With this data, Ahrefs builds analytics services for end-users in the Search Engine Optimization (SEO) space and a web-scale search platform.
We are a lean and robust team who strongly believe that better technology leads to better solutions for real-world problems.
Our motto is "first do it, then do it right, then do it better".
Ahrefs does not engage with agencies or third-party recruitment solutions for the roles we hire for. If at any point we need help, we'll let you know!
What We Do
Ahrefs Pte. Ltd. is a software company that develops online SEO tools and free educational materials for marketing professionals.
Ahrefs is an all-in-one SEO toolset for growing search traffic and optimizing websites. To do that, Ahrefs crawls the web, stores tons of data and makes it accessible via a simple user interface.
When Ahrefs launched its first tool, Site Explorer, it disrupted the stagnant field of backlink analysis and kickstarted a new round of competition among SEO tool providers. It quickly became one of the world’s best backlink analysis tools.
Since then, Ahrefs has grown into a complete SEO suite by developing tools like Keywords Explorer for keyword research, Content Explorer for analyzing content, Rank Tracker for monitoring keyword rankings, and Site Audit for auditing and optimizing websites.