About EarnIn
As a pioneer of earned wage access, our passion at EarnIn is building products that deliver real-time financial flexibility for people living paycheck to paycheck. Our community members access their earnings as they earn them, with options to spend, save, and grow their money without mandatory fees, interest rates, or credit checks. Since our founding, our app has been downloaded over 13M times and we have provided access to over $15 billion in earnings.
We’re fortunate to have an incredibly experienced leadership team, combined with world-class funding partners like A16Z, Matrix Partners, DST, Ribbit Capital, and a very healthy core business with a tremendous runway. We’re growing fast and are excited to continue bringing world-class talent onboard to help shape the next chapter of our growth journey.
About the Role:
This is a high-impact, highly visible role. You will work across the team, help drive data-driven decisions, establish new business metrics, and own automated reporting systems. You will develop solutions to complex business problems characterized by imperfect data. You will gather, transform, and present data to enable data-driven decision-making, and help scale the analytics we offer to both internal and external stakeholders. This position will be based in Bangalore/Bengaluru and will be hybrid.
Responsibilities
- Streamline data flows from source systems and produce high-quality data and pipelines (15%)
- Develop and maintain data pipelines, ETL processes, and data integration workflows to ensure efficient and accurate data acquisition from various internal & external sources using SQL, Python, or other scripting languages
- Work with the data infrastructure team to triage and resolve infra issues, serving as the key bridge between data infrastructure and end customers
- Design, build, and optimize data architecture & models to support business reporting and analytics needs. Understand evolving business requirements to optimize/update data pipelines in SQL
- Monitor and optimize the performance of data systems, troubleshoot issues, and propose solutions for data-related problems
- Design, develop and maintain scaled, automated, user-friendly systems, reports, dashboards, etc. that will support our financial needs (45%)
- Build and deliver high-quality visualization and reporting solutions to support Finance business needs
- Design and build customized reports tailored to each new financial reconciliation need
- Automate reporting for Finance using Python, Periscope, Tableau, Amplitude, and other EarnIn in-house business intelligence tools and platforms
- Interface with Finance, Engineering and Analytics to understand data needs and deliver high quality data products (30%)
- Get familiar with the finance domain at EarnIn and identify opportunities for process automation
- Take ownership of debt facility analytics and other core finance areas as needed
- Build data and functional expertise and own data quality for allocated areas of ownership
- Create quality controls on data feeds to ensure high-quality data is delivered to end products
- Design, build and launch new data extraction, transformation and loading processes in production as and when required
- Implement data governance practices, including data security, privacy, and compliance measures
- Perform deep analysis to uncover areas of opportunity and present recommendations that will help shape the strategy and operations of business stakeholders
- Create project documentation covering requirements, design, processing methodologies, planned changes, and other relevant areas
- Provide analytical support for financial needs such as annual audits, new product revenue reporting, and external lender inquiries
- Be a data and functional expert at Earnin (10%)
- Stay up to date with industry trends and emerging technologies in business intelligence and data analytics
- Be a data SME for the finance domain
- Identify and recommend BI and data tools that can be used within the company for increased productivity and scalability
Requirements:
- Master's degree in Computer Science, Engineering, or an Analytics-related field
- 2+ years of relevant work experience in Analytics, Business Intelligence, Data Science, etc.
- 3+ years of experience in custom ETL design, implementation, and maintenance
- 2+ years experience with schema design and dimensional data modeling.
- 2+ years of experience writing complex SQL queries and Python
- 2+ years experience working with Snowflake / Databricks or similar distributed compute systems
- 3+ years of experience with business intelligence tools such as Tableau, Power BI, or QlikView
- 2+ years of working with relational databases (Redshift, PostgreSQL) and big data structures
- 1+ years of experience with any cloud platform (e.g., AWS) and big data technology (e.g., Spark)
- 2+ years of experience in collaborating with cross-functional teams
- Ability to analyze data to identify deliverables, gaps and inconsistencies
- Strong problem-solving and analytical skills