Simply put, a neo-banking startup for digital natives. Our mission is to help our users demystify their finances, maximize their savings, and spend intelligently. We are building a highly secure "hub": a savings account that lets you consolidate your finances in a single, intuitive view.
Exceptional, innovative people! Passionate about delightful user experiences, clear about doing the right thing, and hungry to impact millions of lives.
Why you should work with us:
We are about doing the right thing, always, both for our team and our users. We are a positive, transparent, and inclusive community that celebrates success together and encourages a bias for action and individual brilliance. We are ambitious and want everyone to think about impact and growth. Our office is not just fun; it is human, nimble, and business-like.
With rich experience in the world's leading tech companies and banks, we deeply and equally understand both the "fin" and the "tech" in fintech. Funded by leading global VCs, we are in pursuit of a fantastic experience for both our consumers and colleagues.
We are looking for Data Engineers with a passion for pushing technology to its limits. You will work with our team of talented technologists to build a secure, reliable, and scalable data platform.
You should apply if:
You possess multidimensional skills that make you a valuable co-worker in a fast-changing and ambiguous environment.
You have 2 to 4 years of experience building pipelines for optimal extraction, transformation, and loading of data from a variety of sources using Hadoop, Hive, Hudi, Spark, NoSQL/columnar/graph databases, and modern cloud data lakes (Cloudera Data Platform or Delta Lake).
You have strong database fundamentals, including schema design, query performance, and optimization of complex joins.
You are proficient in scripting and automation using Python, shell scripting, etc. Familiarity with Scala is a plus.
You have experience setting up and managing infrastructure on cloud providers such as AWS (S3, EMR, Redshift, SNS/SQS, Kinesis) or GCP.
You have experience with workflow management tools: Airflow, Azkaban, Luigi, etc.
You have experience in Data Security and Encryption best practices.
You have experience with stream-processing systems such as Spark Streaming or Flink.
Exposure to configuration management and infrastructure-as-code tools such as Ansible or Terraform is a plus.
You are willing to dig in and understand the data, and to apply creative thinking and problem-solving using open-source tools.
You have built data tools that help analytics and data science team members build and optimize our product.
We would also love to see:
Interesting hobby projects, open-source contributions, etc.
Knowledge sharing, whether through formal mentoring, code reviews, design-document reviews, technical talks, teaching classes, or consulting on projects
Selection process logistics & timelines:
We are currently working remotely and can work from anywhere. We will continue to be remote until it is safe for everyone to return to work and we fully reopen our office in Bangalore.
We aspire to create an inclusive culture of diverse people not just because it's the right thing to do but because heterogeneity inspires us and is more fun! We employ people solely on merit and do not discriminate against any employee or applicant because of race, creed, color, religion, gender, sexual orientation, gender identity/expression, national origin, disability, age, genetic information, marital status, pregnancy or related condition (including breastfeeding), or any other basis protected by law.