* 3+ years of software development experience, or 2+ years of experience with a master's degree in computer science or a related field.
* 3+ years of experience designing and developing data pipelines in Python.

Top candidates will also have proven experience in many of the following:

* Designing, developing, deploying, and maintaining software at scale, with a good understanding of concurrency.
* In-depth understanding of Big Data processing and data pipelines.
* Building large data lakes and data warehouses such as Snowflake (preferred), Redshift, or Synapse.
* Expertise in SQL and NoSQL databases.
* Message brokers and AWS services such as Kafka/Kinesis, SQS, SNS, Lambda, EMR, Glue, DynamoDB, Aurora, and RDS PostgreSQL.
* Deploying software using CI/CD tools such as Jenkins, GoCD, Azure DevOps, or AWS CloudFormation.
* 2+ years of deploying and maintaining software on public clouds such as AWS or Azure.
* Working within an Agile framework (ideally Scrum).
* Test-driven development and behavior-driven development.
* Strong analytical skills.
* Solid knowledge of computer science fundamentals, including data structures, algorithms, and object-oriented design.
* Ability to work under pressure and within time constraints.
* Passion for technology and eagerness to contribute to a team-oriented environment.
* Demonstrated leadership on medium- to large-scale projects impacting strategic priorities.
* Bachelor's degree in computer science, electrical engineering, or a related field is required.