Skilled Data Engineer with 2-3 Years of Experience (Banking Tools Preferred)
We are seeking a motivated and skilled Data Engineer to help us serve our clients by building and maintaining efficient data architectures, pipelines, and reporting solutions. The ideal candidate will have a strong foundation in banking tools. You will work closely with both technical and business stakeholders to ensure our data is accessible, clean, and ready for analysis.
Responsibilities:
- Design & Build Data Pipelines: Design and implement scalable, reliable ETL (Extract, Transform, Load) processes that gather data from multiple sources and integrate it into centralized data warehouses or data lakes (an illustrative sketch follows this list).
- Database Management: Manage relational and non-relational databases, ensuring data integrity,
security, and accessibility. Optimize SQL queries for performance and efficient data retrieval.
- Business Intelligence & Reporting: Develop and maintain interactive dashboards and reports using
Power BI and Tableau.
- Data Transformation & Modeling: Cleanse, transform, and structure raw data into formats suitable
for analysis. Develop efficient data models that enhance the performance of BI tools.
- Monitor & Optimize Data Pipelines: Continuously monitor the performance and reliability of data
pipelines, troubleshoot issues, and implement improvements for better scalability.
- Collaborate with Teams: Work closely with data scientists, analysts, and business teams to understand reporting needs and provide insights into business performance. Ensure that data-driven decisions are based on accurate, well-structured data, and that dashboards are intuitive, user-friendly, and deliver actionable insights to business stakeholders.
- Data Governance: Adhere to best practices for data governance, ensuring compliance with security
protocols and data privacy standards.
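To illustrate the kind of pipeline work described above, here is a minimal sketch of a daily ETL job using Apache Airflow's TaskFlow API (one of the tools listed in the requirements below). The DAG name, sample records, and load target are hypothetical placeholders, not a prescribed implementation.

```python
# Minimal sketch of a daily ETL DAG (Airflow 2.x TaskFlow API).
# The DAG name, sample records, and load target are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_transactions_etl():
    @task
    def extract() -> list[dict]:
        # A real pipeline would pull from a source system (e.g., an
        # operational banking database) via a connection hook.
        return [{"txn_id": 1, "amount": "125.40", "currency": "usd"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Cleanse and normalize raw records into an analysis-ready shape.
        return [
            {
                "txn_id": r["txn_id"],
                "amount": float(r["amount"]),
                "currency": r["currency"].upper(),
            }
            for r in rows
        ]

    @task
    def load(rows: list[dict]) -> None:
        # In production this would write to a warehouse such as Snowflake;
        # here it only reports the row count.
        print(f"Loaded {len(rows)} rows")

    load(transform(extract()))


daily_transactions_etl()
```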
Minimum Requirements: 2-3 years of experience (banking tools preferred)
- Deep expertise in Apache Airflow, Snowflake, Databricks, Python, SQL, and PySpark, delivering reliable, high-performance data pipelines (a minimal sketch of this kind of work appears at the end of this posting).
- Proficiency in creating advanced visualizations, reports, and KPIs.
- Databases: Experience with relational databases (e.g., SQL Server, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- ETL Tools & Data Integration: Familiarity with ETL tools (Azure Data Factory, Talend, Apache NiFi)
for data extraction, transformation, and loading.
- Data Observability: Experience building robust data observability and monitoring using SLAs, alerting, and incident-management tools to support business-critical workloads.
- Enterprise BI & Compliance: Hands-on experience with enterprise BI and compliance solutions using Tableau and Looker, including support for AML and KYC workflows.
- Data Warehousing Solutions: Knowledge of data warehousing concepts and platforms (e.g.,
Snowflake, Amazon Redshift, Google BigQuery).
- Cloud Technologies: Experience with cloud platforms like AWS, Azure, or Google Cloud for data
storage, processing, and analysis.
- Programming Languages: Experience in Python or R for data wrangling, transformation, and
analysis.
- Version Control: Experience with version control tools like Git to manage and track changes in
scripts and code.
- Strong problem-solving and analytical skills, with attention to detail.
- Excellent communication skills, both written and verbal.
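For candidates who want a concrete sense of the transformation work mentioned in the requirements, the following is a minimal, self-contained PySpark sketch of a typical cleansing step. The column names and sample rows are hypothetical and stand in for real source data.

```python
# Minimal sketch of a PySpark cleansing/transformation step.
# Column names and sample rows are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("txn_cleanse_sketch").getOrCreate()

raw = spark.createDataFrame(
    [("1", " 125.40", "usd"), ("2", None, "EUR")],
    ["txn_id", "amount", "currency"],
)

clean = (
    raw.withColumn("amount", F.trim("amount").cast("double"))  # normalize type
    .withColumn("currency", F.upper("currency"))               # standardize codes
    .dropna(subset=["amount"])                                  # drop unusable rows
)

clean.show()
```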