Senior Data Engineer

Full Time

Software Engineering

San Francisco, CA

Remote OK

About Hipcamp

This position is available as a remote position or in our San Francisco office.

Hipcamp makes it easy to list, discover, and book campgrounds and accommodations on private and public land. Whether you’re looking for a scenic spot to pitch your tent or planning a nature-filled getaway, Hipcamp is your go-to guide to getting outside. One of our core beliefs is that people who are connected and aligned with nature are happier and healthier humans. Our mission is to get more people outside. We accomplish this by making nature more accessible, creating revenue to support the protection of land, and creating community across the urban-rural divide. You can read through our full company values here. You will have the opportunity to join a team that respects each other, values empathetic leadership and cross-collaboration, and is wildly passionate about getting more people outside.


You’re a detail-oriented, data-focused software engineer. You enjoy building and analyzing complex systems. You have cloud infrastructure and automation experience. You’re comfortable working with large volumes of data. You enjoy continuous improvement and have the humility to go out of your way to ask fundamental questions; because of that, you learn fast. You learn by doing and iterate quickly and often. You care deeply about quality, consistency, and reliability. You value data privacy and security and take the initiative to be a change agent for protecting customer data.

About the role
Extracting value from data is fundamental to Hipcamp’s future success. This role offers the opportunity to play a central part on an innovative, energetic team as an early member of the data platform team at a fast-growing, mission-driven startup.

You will build our data platform, focusing on fundamental infrastructure (e.g., Airflow, managed Spark, data warehousing, cloud DevOps); sometimes supporting our data pipelines; and sometimes working to understand business needs and reshape our data gathering at the source to better meet them.
You will work most closely with our founding data engineer, our data scientists, and our data analysts. You will also frequently work with business stakeholders, product managers, and software engineering team members.

Our data team is responsible for meeting important needs across the organization. These range from analyzing experiments that test alternative product implementations, to SEO optimization, to feeding tools used by finance, marketing, and support, to optimizing our search and discovery algorithms.

What skills we ask for
* Experience building high-performance scalable data warehouses
* Experience designing and building intelligent, easy-to-use data models for optimal storage and retrieval
* Experience building data quality checks and monitoring systems to ensure high-quality data sets
* Experience with both production databases and data warehouses
* Strong Python skills to execute needed data transformations and infrastructure management
* 5+ years of SQL experience
* 5+ years of Python or similar development experience
* Data Operations experience (experience with cloud managed services such as Kafka, ELK, SumoLogic, Spark, Data Cataloging, Airflow, Docker microservices)
* Experience building data pipelines using Kafka, Kinesis or related technology
* Experience building out production-ready, cloud-first data infrastructure (required), preferably on AWS; experience with search engines (Elasticsearch preferred) is desirable
* Must be comfortable understanding business processes and directly interacting with business stakeholders to ensure quality and completeness of the data we are gathering
* Experience building and managing data flows to and from third party systems and APIs (SEO, CRMs, pub/sub apis, email send/track, affiliate marketing, etc.)

Bonus Skills
* Experience with AWS Lambda functions and Docker microservices
* ECS, EMR, Kubernetes experience
* Cloud security and data privacy (GDPR for example)
* PostgreSQL
* Working in a Ruby on Rails/React environment
* Airflow or related ETL framework experience
* Have supported BI systems such as Looker or Tableau
* Terraform automation scripting (or Puppet, Chef, Ansible, etc)
* Recommendation systems and algorithms experience

Working at Hipcamp
* As a team, we're committed to striving toward and evolving these shared values (drive.google.com/file/d/10PaRt2lOXQq3tVrQYkB_2IiZlXHAjRA6/view)  in ourselves and in other team members
* We believe health is essential to happiness. That's why teammates receive a monthly fitness stipend, an annual camping credit to get outside, as well as 50% off all Hipcamp listings (if you used up all of your annual credit). :)
* We support and appreciate individual working styles and help each other’s productivity. Each Hipcamp team member completes a “how to work with me” document so we lay it all out from the get-go without having to guess.
* We take team camping trips together, have a stocked gear library and access to pro deals at various outdoor brands.
* Our sunny, plant-filled office is located at 965 Mission Street, near the Powell BART station.
* We hope you have a (well-behaved) dog to bring to our office.
* We love sunlight, plants, the enneagram test, avocados, smoked salmon, bitchin’ sauce and hope you do too.

Hipcamp is an equal opportunity employer and values diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. In fact, we are confident that the most inclusive and diverse teams accomplish the most extraordinary results.
