We're a small, ambitious team spread across the globe. We sync up on a monthly all-hands Zoom call, and most teams have a call together every two weeks. Everything else happens asynchronously, via Slack, GitHub, Linear, and Notion. That means you can pick the hours that work best for you, so you can be at your most productive.
To thrive in this environment you'll need a high degree of autonomy and ownership, and you'll have to be resourceful and able to work effectively in a remote setup.
We're looking to add an engineer to our 4-person data team. You'll work on improving our data pipelines, maintaining our data infrastructure, handling and monitoring related requests and issues from our systems and our customers, and helping us cement our position as an industry leader. Some things we've recently been working on in the data team:
- Building out our global network of probe servers, and doing internet-wide data collection (ping, traceroute, etc).
- Finding, analyzing, and incorporating existing data sets into our pipeline to improve our quality and accuracy.
- Building ML models to classify IP address usage as consumer ISP, hosting provider, or business.
- Inventing and implementing scalable algorithms for IP geolocation and other big data processing.
In this role, you'll also work closely with our support team, fielding data-related questions from users.
Here are some of the tools we use. It's great if you already have experience with them, but if not, we'd expect you to ramp up quickly without any problems:
- Google Composer / Apache Airflow
- Python / Bash / SQL
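To give a flavor of how these fit together (this is an illustrative sketch only; the table and column names are hypothetical, not our actual schema), a typical pipeline step often boils down to Python driving SQL:

```python
import sqlite3

# Illustrative sketch: load some ping measurements and aggregate them with SQL.
# Uses an in-memory SQLite database purely for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pings (probe TEXT, target TEXT, rtt_ms REAL)")
conn.executemany(
    "INSERT INTO pings VALUES (?, ?, ?)",
    [("probe-1", "192.0.2.1", 12.5),
     ("probe-1", "192.0.2.1", 14.1),
     ("probe-2", "192.0.2.1", 98.3)],
)

# Average round-trip time per probe.
for probe, avg_rtt in conn.execute(
    "SELECT probe, AVG(rtt_ms) FROM pings GROUP BY probe ORDER BY probe"
):
    print(probe, round(avg_rtt, 1))
```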
Any IP address domain knowledge would be useful too, but we can help get you up to speed here:
- ASN / BGP / CIDR / Ping / Traceroute / Whois etc
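As a small taste of this domain (just a sketch, using documentation-reserved addresses rather than real data), the CIDR prefix matching that underpins IP geolocation lookups is easy to play with in Python's standard library:

```python
import ipaddress

# Does an IP address fall inside a CIDR block? This prefix-membership check
# is the basic building block of IP-to-location and IP-to-ASN lookups.
# 203.0.113.0/24 is a documentation range (TEST-NET-3), used here as a stand-in.
network = ipaddress.ip_network("203.0.113.0/24")
address = ipaddress.ip_address("203.0.113.42")

print(address in network)     # True: the address is within the /24 prefix
print(network.num_addresses)  # 256: a /24 covers 2**(32-24) addresses
```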