Senior Data Engineer - Big Data / Data Pipelines - REMOTE at NAVIS (Bend, OR) (allows remote)
Posted about 2 years ago
NAVIS is excited to be hiring a Sr. Data Engineer! This is a NEW position due to growth in this area.
Be a critical element of what sets NAVIS apart from everyone else! Join the power behind the best-in-class Hospitality CRM software and services that unify hotel reservations and marketing teams around their guest data to drive more bookings and revenue.
Our Guest Experience Platform team is seeking an experienced Senior Data Engineer to play a lead role in building and running the modern big data and machine learning platform that powers our products and services. In this role, you will be responsible for building the analytical data pipeline, data lake, and real-time data streaming services. You should be passionate about technology and complex big data business challenges.
You can have a huge impact on everything from the functionality we deliver for our clients, to the architecture of our systems, to the technologies that we are adopting.
You should be highly curious with a passion for building things!
DUTIES & RESPONSIBILITIES:
- Design and develop business-critical data pipelines and related back-end services
- Identify and help address scalability issues in our enterprise-level data pipelines, simplifying where possible
- Design and build big data infrastructure to support our data lake
REQUIRED SKILLS & EXPERIENCE:
- Expert-level programming mastery of Python (at least five (5) years of continuous Python experience within the context of data pipelines / big data)
- The ability to teach advanced Python development techniques to developers with mid-level Python skill sets
- At least two (2) years of extensive experience with Spark / PySpark
- Experience with building, breaking, and fixing production data pipelines
- Hands-on SQL skills and background in other data stores like SQL Server, Postgres, or MongoDB
- Able to identify and help address scalability issues for enterprise-level / big data systems
DESIRED, BUT NOT REQUIRED SKILLS:
- Any experience with AWS services like Glue, S3, SQS, Lambda, Fargate, EC2, Athena, Kinesis, Step Functions, DynamoDB, CloudFormation, and CloudWatch will be a huge plus
- Any experience with MapReduce, YARN, HDFS, Hive, Presto, HBase, or Parquet is preferred, but not required
- Experience with continuous delivery and automated deployments (Terraform)
- Experience with machine learning or interest in picking it up
There are 3 options for the location of this position:
- Work remotely in the continental US with occasional travel to Bend, Oregon
- Based at a shared office space in the heart of downtown Portland, Oregon
- Based at our offices in Bend, Oregon (relocation assistance package available)
BENEFITS & PERKS:
- An inclusive, fun, values-driven company culture – we’ve won awards for it
- A growing tech company in Bend, Oregon
- Work / Life balance - what a concept!
- Excellent benefits package with a Medical Expense Reimbursement Program that helps keep our medical deductibles LOW for our Team Members
- 401(k) with generous matching component
- Generous time off plus a VTO day to use working at your favorite charity
- Competitive pay + annual bonus program
- FREE TURKEYS (or pies) for every Team Member for Thanksgiving (hey, it's a tradition around here)
- Your work makes a difference here, and we make a huge impact to our clients’ profits
- Transparency – regular All-Team meetings, so you can stay in-the-know with what’s going on in all areas of our business