Only available to applicants residing in Argentina
Our client envisions a world where the experience of selling or buying a home is simple and enjoyable for everyone. They offer a comprehensive cloud-based platform that enables residential real estate agents to deliver exceptional service to their seller and buyer clients. Founded in 2012 as one of the fastest-growing technology companies in a nearly $4 trillion industry, they have built a world-class engineering team that operates the only comprehensive platform in the real estate industry. Our client is convinced it can do much more and needs your expertise in building modern cloud services to evolve and create products that improve every step of the real estate agent experience, from first contact with a client to closing the deal.

Our team is responsible for evaluating, accelerating, building, and maintaining a unified, scalable, and cost-effective analytics infrastructure, including the data lake, a data warehouse, and tooling for data job scheduling and orchestration.

As a Data Engineer, you will be responsible for building, optimizing, and maintaining data pipelines using distributed computing on Cloud. The ideal candidate is an experienced data wrangler who can understand and optimize data systems from the ground up. The Data Engineer will support our software developers, analysts, and data scientists on data initiatives and ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

Job Responsibilities

  • Security–scan the entire data platform for security risks, including but not limited to networking, storage, data pipelines, and credentials; build security metrics; remediate security risks; and implement long-term solutions to security problems.
  • Compliance–implement SOX financial-compliance and CCPA privacy-compliance tasks, and automate manual compliance work.
  • Scheduler reliability–evaluate the AWS managed Airflow service, launch the long-term supported scheduling service, and migrate and consolidate legacy Airflow jobs into it.
  • Operational excellence–be a strong owner of high-quality platform operations by automating and simplifying manual infrastructure processes.

Required Skills & Experience

  • 5+ years of experience architecting cloud data platforms
  • 3+ years of experience setting up, managing, and automating the infrastructure of cloud data platforms, including Airflow, Databricks, Spark, and Kafka
  • 3+ years of experience with Infrastructure as Code technologies such as AWS CloudFormation and Terraform
  • Experience with infrastructure security review and risk remediation
  • Solid programming skills in Python, Java, or scripting languages
  • Hands-on experience with databases and SQL
  • B.S., M.S., or Ph.D. in Computer Science or equivalent

Desirable Skills

  • 5+ years of experience with CS fundamentals and OOP languages such as Java and Python
  • 3+ years of experience with AWS technologies such as S3, EC2, MSK, RDS, and EMR
  • 5+ years of experience with big data technologies such as Databricks, Spark, Presto, Airflow, and Kafka
  • Experience improving efficiency, scalability, and stability of system resources

What is the interview process like?

1) Screening interview with the IT Scout team.
2) Once the team receives your updated resume, the next step is a short chat with the recruiting team to get to know you better and answer your questions.
3) Right after, you'll get an invite to a technical screening interview. This is conducted by a partner and pairs you with a seasoned engineer; it offers night and weekend availability, a low-pressure redo opportunity, and an objective interview format that reduces bias.
4) The main loop of interviews is as follows; each interview takes about 60 minutes, with 10 minutes reserved at the end so that you can ask questions (we think it's important that you get to know us, too):
+ Main Coding interview
+ System design interview
+ Tech deep dive & cultural fit

This vacancy is a contractor position paid in USD, and it includes work tools (notebook shipment), Argentine holidays, and two weeks of vacation.