212 Database Architects jobs in Pakistan
Data Architect
Posted today
Job Description
We are seeking a highly experienced Data Evangelist to lead a major technology refresh and data transformation initiative for a leading bank in KSA. The ideal candidate will be a strategic thinker and hands-on practitioner with deep expertise in data architecture, financial data models, governance frameworks, and modern data methodologies. This role will involve working closely with senior stakeholders, guiding end-to-end delivery, and ensuring alignment with the bank's unique landscape and operational priorities.
Key Responsibilities:
Lead the bank's data transformation journey, from strategy to execution, ensuring alignment with business goals and regulatory requirements.
Facilitate and guide complex architecture-level discussions across data platforms, integration, and analytics.
Design, implement, and oversee financial data models, governance structures, and operational frameworks.
Advise on and introduce modern data concepts such as Data Mesh, Data Fabric, and Data Products as the program matures.
Collaborate with multi-disciplinary teams to deliver scalable, secure, and future-ready data solutions.
Provide domain leadership in banking operations with a strong understanding of KSA's regulatory and market context.
Leverage additional domain expertise in telecom and/or retail to drive cross-industry data innovations.
Qualifications & Experience:
8+ years in data strategy, architecture, and governance, preferably in the banking sector.
Proven track record in leading large-scale data transformation initiatives.
Strong knowledge of data management methodologies, frameworks, and regulatory requirements in KSA.
Hands-on expertise with data modeling, integration patterns, and analytics platforms.
Experience with modern data paradigms (Data Mesh, Data Fabric, Data Products).
Excellent stakeholder management and communication skills, with the ability to influence at C-level.
Additional experience in telecom or retail industries is a plus.
Data Architect
Posted today
Job Description
Ciklum
is looking for a
Data Architect
to join our team full-time in Pakistan.
We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.
About the role:
As a Data Architect, become a part of a cross-functional development team engineering experiences of tomorrow. Ciklum is building a team to work on various projects whose primary goal is to improve and automate the customer's business processes, reducing the time and effort required for various operations. The client for this project is the largest professional services network in the world by revenue and number of professionals, providing audit, tax, consulting, enterprise risk and financial advisory services worldwide.
Responsibilities:
- Translate business requirements documentation into data requirements and source-to-target data mappings
- Actively participate in design and data model reviews providing constructive feedback
- Provide guidance to development teams on technical data-related issues and data standards
- Provide guidance, standards, and model reviews for analytics and insights
- Ensure changes to the data comply with architectural standards
- Design and architect service and data integrations
- Get hands-on and support team in the implementation of complex business logic and algorithms
- Work with Deloitte DA to develop database solutions to store and retrieve data through the Intela platform
- Design conceptual and logical data models and flowcharts
- Improve system performance by troubleshooting and integrating new elements/technologies
- Optimize new and current database systems
- Work with the DevOps team to enforce security and backup procedures
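The first responsibility above mentions translating requirements into source-to-target data mappings; as a rough illustration of what such a mapping looks like in code (all field names and transforms are hypothetical, not taken from the client's systems), in Python:

```python
# Minimal sketch of a source-to-target mapping: each target column is
# derived from a source column plus an optional transformation.
# All field names here are hypothetical examples.

MAPPING = {
    # target_column: (source_column, transform)
    "client_id":   ("CustomerID", str),
    "full_name":   ("Name", lambda v: v.strip().title()),
    "revenue_usd": ("Revenue", float),
}

def apply_mapping(source_row: dict) -> dict:
    """Project a source record onto the target schema."""
    return {tgt: fn(source_row[src]) for tgt, (src, fn) in MAPPING.items()}

row = {"CustomerID": 42, "Name": "  ada lovelace ", "Revenue": "1250.50"}
print(apply_mapping(row))
# {'client_id': '42', 'full_name': 'Ada Lovelace', 'revenue_usd': 1250.5}
```

In practice such mappings are usually maintained as documentation (spreadsheets or modeling-tool exports); the dictionary form simply makes the column-to-column contract explicit.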
Requirements:
- SQL Server; data architecture and modeling; Azure data solutions (Synapse, integration pipelines, data lakes); cloud (preferably Azure); applying design patterns; VSTS / Azure DevOps; MS Fabric; Power BI; exposure to container-hosted solutions
- 8+ years of technology experience in an enterprise or web scale product company
- 4+ years of hands-on experience in database, data lakes design, architectures, modeling and cloud computing design, implementation, and/or support
- 3+ years with cloud platforms (preferably Azure) and Agile methodology
- 4+ years of experience in designing and building distributed applications
- Solid understanding of, and hands-on experience with, data structures, modeling, and data warehouse concepts
- Understanding of Object-Oriented Concepts
- Exposure to Azure data solutions such as Azure Synapse and Data Lake is preferred
- Good understanding of Power BI Embedded concepts and working with Power BI REST APIs for embedding reports, managing workspaces, and automating deployments
- Excellent communication and analytical skills; comfortable communicating with executives
- Excellent troubleshooting and communication skills, ability to communicate clearly with US counterparts
What's in it for you?
- Care: your mental and physical health is our priority. We offer a competitive benefits package that includes, but is not limited to, comprehensive medical coverage, life insurance, gym membership, fuel, internet and mobile allowances, and a provident fund
- Tailored education path: boost your skills and knowledge with our regular internal events (meetups, conferences, workshops), Udemy license, language courses and company-paid certifications
- Growth environment: share your experience and level up your expertise with a community of skilled professionals, locally and globally
- Flexibility: own your schedule – you are the one to decide when to start your working day. Just don't miss your regular team stand-up
- Opportunities: we value our specialists and always find the best options for them. Our Internal Mobility Program helps you change projects if needed to help you grow, excel professionally, and fulfill your potential
- Global impact: work on large-scale projects that redefine industries with international and fast-growing clients
- Welcoming environment: feel empowered with a friendly team, open-door policy, informal atmosphere within the company and regular team-building events
About us:
At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you'll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress.
With over 14 years of presence in Islamabad, our Pakistan team is a trusted force in redefining industries. We value every voice and invite experienced professionals to help shape tomorrow's digital experiences.
Want to learn more about us? Follow us on Instagram, Facebook, and LinkedIn.
Explore, empower, engineer with Ciklum
Interested already? We would love to get to know you. Submit your application – we can't wait to see you at Ciklum.
Data Architect
Posted today
Job Description
We are seeking a skilled and versatile Data Developer with hands-on experience in Snowflake, Power BI, T-SQL/SSIS, and Microsoft Fabric. The ideal candidate will be responsible for designing, developing, and supporting data integration, reporting, and analytics solutions that support business decision-making and operations.
Responsibilities:
- Design, implement, and maintain Snowflake data architecture.
- Troubleshoot and resolve issues related to Snowflake environments.
- Develop, optimize, and manage complex T-SQL queries for ETL processes.
- Design, implement, deploy, and support SSIS packages for seamless data integration.
- Monitor and troubleshoot SSIS jobs to ensure reliable data workflows.
- Develop features and enhancements related to data management, orchestration, and reporting within the Microsoft Fabric platform
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience with:
- Snowflake development and support.
- Power BI dashboard/report development and performance tuning.
- T-SQL scripting for data transformation and analysis.
- SSIS package design, deployment, and troubleshooting.
- Experience with Microsoft Fabric for modern data platform development.
- Familiarity with C# and .NET Core Web API development is a plus.
- Strong analytical, problem-solving, and communication skills.
- Ability to work independently and collaboratively in a fast-paced environment
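Several bullets above involve complex T-SQL for ETL; the upsert (MERGE) pattern at the heart of many such loads can be sketched in plain Python (the keys and columns are illustrative assumptions, not a real schema):

```python
# Sketch of MERGE (upsert) semantics commonly used in T-SQL ETL: match
# target rows to incoming rows on a key; update matches, insert the rest.
# Keys and column names are illustrative, not from any real schema.

def merge(target: dict, incoming: list[dict], key: str) -> dict:
    for row in incoming:
        # Existing row is updated field-by-field; unmatched rows are inserted.
        target[row[key]] = {**target.get(row[key], {}), **row}
    return target

target = {1: {"id": 1, "city": "Lahore"}}
incoming = [{"id": 1, "city": "Karachi"}, {"id": 2, "city": "Islamabad"}]
merged = merge(target, incoming, "id")
print(merged[1]["city"], len(merged))  # Karachi 2
```

The same match-update-insert logic is what a T-SQL `MERGE` statement (or an SSIS lookup-plus-conditional-split pattern) expresses against real tables.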
Data Architect
Posted today
Job Description
Job Summary:
We are seeking an experienced and detail-oriented Data Architect to design and optimize our enterprise data systems. The ideal candidate will have at least 8 years of experience in data modeling, database architecture, and analytics infrastructure, with a strong background in SQL/NoSQL databases, data pipelines, cloud platforms, and data governance. This role is critical to shaping how data is stored, processed, and consumed across the organization.
Key Responsibilities:
• Design and implement data architecture strategies that support business objectives.
• Define data models, data flow diagrams, and schemas across structured and unstructured data.
• Lead the design of data pipelines and ETL/ELT workflows for ingestion, transformation, and storage.
• Architect and manage data lakes, data warehouses, and real-time data platforms.
• Ensure data quality, integrity, security, and compliance with regulatory requirements.
• Collaborate with engineering, product, and analytics teams to align data strategies with business goals.
• Optimize database performance, indexing strategies, and query tuning.
• Establish and enforce data governance standards and best practices.
• Evaluate and integrate new tools, platforms, and technologies to improve the data ecosystem.
• Document data architecture decisions, technical designs, and data dictionaries.
Soft Skills & Competencies:
• Strong analytical and problem-solving skills.
• Excellent verbal and written communication to bridge technical and business teams.
• Ability to lead data strategy discussions and influence architecture decisions.
• Experience with project and stakeholder management.
• Strategic mindset with attention to long-term scalability and maintainability.
Technical Skills & Experience:
• 5+ years of experience in data engineering or architecture roles.
• Expertise in relational databases (e.g., MySQL, PostgreSQL, Oracle) and NoSQL databases (e.g., MongoDB, Cassandra).
• Strong experience with data modeling tools (e.g., ERwin, dbt, Lucidchart).
• Hands-on with big data technologies such as Hadoop, Spark, Hive, Presto.
• Proficiency in ETL/ELT tools (e.g., Apache NiFi, Talend, Informatica, Airflow).
• Experience with cloud-based data platforms (e.g., AWS Redshift, GCP BigQuery, Azure Synapse).
• Strong knowledge of data warehousing concepts, data lakes, and real-time streaming architectures.
• Familiarity with Kafka, Kinesis, or other event streaming tools.
• Proficient in Python, SQL, and optionally Java or Scala for data processing.
• Solid understanding of data security, encryption, access control, and compliance standards (GDPR, HIPAA, etc.).
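The governance and quality responsibilities above typically translate into declarative record-level checks before data is loaded. A minimal sketch of such a data-quality gate, with rule names and thresholds that are purely illustrative:

```python
# Minimal data-quality gate of the kind a governance standard might mandate:
# declarative checks run against each record before it is loaded.
# Rule names and thresholds here are illustrative assumptions.

RULES = {
    "email_present": lambda r: bool(r.get("email")),
    "age_in_range":  lambda r: 0 <= r.get("age", -1) <= 120,
}

def validate(record: dict) -> list[str]:
    """Return the names of the rules the record violates."""
    return [name for name, check in RULES.items() if not check(record)]

good = {"email": "a@example.com", "age": 30}
bad  = {"email": "", "age": 200}
print(validate(good), validate(bad))
# [] ['email_present', 'age_in_range']
```

Dedicated frameworks provide the same idea at scale, with profiling, reporting, and scheduling layered on top of rule sets like this.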
Data Architect
Posted today
Job Description
Office Field is looking for a diligent and experienced Data Architect to join our team
Responsibilities:
Design and implement effective database solutions for stock, crypto, and financial data.
Establish best practices in architecture, design, and development.
Perform SQL tuning and optimize database responsiveness.
Use Snowflake to support high-frequency data ingestion and enhance data quality.
Implement and maintain ETL workflows with Airflow and Snowflake.
Ensure software quality, security, and modifiability.
Address data challenges in PostgreSQL and MongoDB databases.
Optimize ETL workflows using Apache Airflow.
Requirements:
Bachelor's degree in Computer Science or related field.
6+ years' experience in similar roles.
Proficiency in Firebase, MongoDB, MySQL, Hadoop, Firehose, and PostgreSQL.
Experience with Snowflake or similar cloud data warehousing.
Knowledge of API frameworks like FastAPI and real-time data ingestion.
Familiarity with AWS and GCP.
Extensive experience with Apache Airflow for data pipelines.
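The Airflow experience asked for above centers on DAGs of dependent tasks; the ordering such a DAG encodes can be sketched with Python's standard-library `graphlib` (the task names are hypothetical, not from any real pipeline):

```python
# Sketch of the dependency ordering an Airflow DAG encodes: a task runs
# only after all of its upstream tasks complete. Task names are hypothetical.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load_snowflake": {"transform"},
    "refresh_reports": {"load_snowflake"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)
# ['extract', 'transform', 'load_snowflake', 'refresh_reports']
```

In Airflow itself the same structure is declared with operators and the `>>` dependency syntax; the scheduler then performs exactly this kind of topological ordering.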
Job Type: Full-time
Experience:
- Python: 1 year (Preferred)
- Data Science: 1 year (Preferred)
Work Location: In person
Data Architect
Posted 9 days ago
Job Description
Join to apply for the Data Architect role at GSPANN Technologies, Inc.
Description
GSPANN is hiring a Data Architect for their data platform. As we march ahead on a tremendous growth trajectory, we seek passionate and talented professionals to join our growing family.
Location: Gurugram / Hyderabad / Pune
Role Type: Full Time
Published On: 26 September 2024
Experience: 8+ Years
Role and Responsibilities:
- Develop architectural strategies for data modeling, design, and implementation to meet requirements for metadata management, master data management, Data warehouses, ETL, and ELT.
- Analyze business requirements, design scalable/robust data models, document conceptual, logical, and physical data model design, assist developers in creating database structures, and support throughout the project lifecycle.
- Lead and mentor data engineers, focusing on skill growth and DevOps/DataOps principles.
- Investigate new technologies and methods to incorporate into data architectures, developing implementation timelines.
- Ensure consistency between models and the ecosystem model, resolving conflicts.
- Participate in designing information architecture, review models, glossaries, flows, and data usage.
- Guide the team to achieve project milestones.
- Work independently within broad guidelines and policies.
- Contribute as an expert to multiple teams, define best practices, and build reusable components.
- Apply data processing techniques such as curation, standardization, and normalization to achieve data readiness.
Skills And Experience:
- Graduate or Post-Graduate in Computer Science, Electronics, or Software Engineering.
- 6+ years of experience in Data Modeling for data warehouses and analytics applications.
- Expertise in conceptual, logical, and physical data modeling, with experience in Enterprise Data Warehouses and Data Marts.
- Good understanding of cloud database technologies like GCP, Redshift, Aurora, DynamoDB.
- Experience with data governance, quality, and security teams.
- Handling large databases and data volumes.
- Experience in building and optimizing data pipelines using ETL/ELT, CDC, message-oriented data movement, and modern ingestion technologies.
- Strong experience with existing ETL processes and data integration flows.
- Hands-on with data discovery and BI tools like MicroStrategy, Tableau, Power BI.
- Effective leadership and mentorship skills.
- Excellent communication skills for presentations across functions.
- Mid-Senior level
- Full-time
- Engineering and Information Technology
Data Architect
Posted 9 days ago
Job Description
We are looking for an experienced Data Architect with 12+ years of progressive experience in data engineering, architecture design, and technical leadership. You will lead the architectural strategy and execution for our real-time and batch data platforms, ensuring scalability, reliability, and performance across a modern cloud-native ecosystem. This role is suited for a visionary technologist who can drive enterprise-grade solutions, establish best practices, and mentor high-performing engineering teams.
Key Responsibilities:
- Define and evolve the data architecture roadmap, ensuring alignment with business goals and scalability of data platforms.
- Architect real-time streaming and batch data pipelines using Kafka, Apache Flink, and Spark Structured Streaming.
- Lead the design of lakehouse architectures leveraging technologies such as Apache Spark (PySpark), Delta Lake, Iceberg, and Parquet.
- Establish governance, security, and compliance frameworks across data platforms and pipelines.
- Set standards for data orchestration using tools like Apache Airflow and Azure Data Factory.
- Guide the containerization strategy with Docker, Kubernetes, and Helm for scalable deployment of data applications.
- Architect and optimize analytics engines such as Trino or Presto for high-performance query execution.
- Define and implement observability frameworks to monitor SLAs, lineage, data quality, and pipeline health.
- Collaborate with DevOps on infrastructure-as-code (IaC) using Terraform and CI/CD automation through Azure DevOps.
- Establish and advocate best practices for coding standards, modular development, and testing in Python and SQL.
- Collaborate cross-functionally with product owners, data scientists, platform teams, and engineering leads.
- Lead architecture reviews, POCs, and technology evaluations to guide technology adoption and innovation.
- Mentor and guide senior and mid-level data engineers across solutioning, technical decisions, and design patterns.
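The streaming responsibilities above (Kafka, Flink, Spark Structured Streaming) ultimately compute windowed aggregates; the tumbling-window idea behind them can be sketched in plain Python (the event shape and the 60-second window size are assumptions for illustration):

```python
# Sketch of a tumbling-window count, the basic aggregation behind many
# Flink / Spark Structured Streaming jobs. Event fields and the 60-second
# window size are illustrative assumptions.
from collections import Counter

WINDOW = 60  # window size in seconds

def window_counts(events: list[tuple[int, str]]) -> Counter:
    """Count events per (window_start, key) bucket."""
    return Counter(((ts // WINDOW) * WINDOW, key) for ts, key in events)

events = [(5, "click"), (42, "click"), (61, "click"), (70, "view")]
counts = window_counts(events)
print(counts[(0, "click")], counts[(60, "click")], counts[(60, "view")])
# 2 1 1
```

A real streaming engine adds what this sketch omits: incremental state, event-time watermarks for late data, and fault-tolerant checkpointing.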
Required Skills & Experience:
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related discipline.
- 12+ years of experience in data engineering and architecture, with at least 3 years in a lead/architect role.
- Able to drive work to completion despite challenges such as unavailable stakeholders or incomplete information.
- Expert-level knowledge of Python, SQL, and scalable distributed systems.
- Deep expertise in streaming architectures with Kafka, Flink, and Spark Streaming.
- Strong experience designing and implementing batch data workflows using Apache Spark (PySpark).
- In-depth knowledge of cloud data platforms , particularly Azure (preferred).
- Extensive experience with data orchestration tools like Airflow and Azure Data Factory.
- Proven ability to design and implement data lakehouse architectures with formats like Parquet, Delta Lake, and Iceberg.
- Proficiency in containerization (Docker), orchestration (Kubernetes), and deployment via Helm.
- Solid experience implementing observability, lineage, and data quality frameworks, such as Great Expectations.
- Strong background in infrastructure as code (Terraform) and CI/CD integrations.
- Deep understanding of data governance, metadata management, and data privacy standards.
- Excellent communication skills with the ability to influence stakeholders and present architectural strategies to leadership.
- Experience working in agile development environments and fostering team-wide engineering excellence.
- Experience with large-scale API architecture and scalable data access layers is a plus.
- Familiarity with AI/ML-integrated data systems or tools (e.g., text-to-SQL) is a bonus.
We have an amazing team of 700+ individuals working on highly innovative enterprise projects & products. Our customer base includes Fortune 100 retail and CPG companies, leading store chains, fast-growth fintech, and multiple Silicon Valley startups.
What makes Confiz stand out is our focus on processes and culture. Confiz is ISO 9001:2015, 27001:2013 & 20000-1:2018 certified. We have a vibrant culture of learning via collaboration and making the workplace fun.
People who work with us work with cutting-edge technologies while contributing success to the company as well as to themselves.
To know more about Confiz Limited, visit:
Data Architect
Posted 14 days ago
Job Viewed
Job Description
Data Architect
with 12+ years of progressive experience in data engineering, architecture design, and technical leadership. You will lead the architectural strategy and execution for our real-time and batch data platforms, ensuring scalability, reliability, and performance across a modern cloud-native ecosystem. This role is suited for a visionary technologist who can drive enterprise-grade solutions, establish best practices, and mentor high-performing engineering teams. Key Responsibilities: Define and evolve the
data architecture roadmap , ensuring alignment with business goals and scalability of data platforms. Architect
real-time streaming
and
batch data pipelines
using Kafka, Apache Flink, and Spark Structured Streaming. Lead the design of
lakehouse architectures
leveraging technologies such as Apache Spark (PySpark), Delta Lake, Iceberg, and Parquet. Establish
governance, security, and compliance
frameworks across data platforms and pipelines. Set standards for
data orchestration
using tools like Apache Airflow and Azure Data Factory. Guide the containerization strategy with Docker, Kubernetes, and Helm for scalable deployment of data applications. Architect and optimize
analytics engines
such as Trino or Presto for high-performance query execution. Define and implement observability frameworks to monitor SLAs, lineage, data quality, and pipeline health. Collaborate with DevOps on
infrastructure-as-code (IaC)
using Terraform and CI/CD automation through Azure DevOps. Establish and advocate best practices for
coding standards, modular development, and testing
in Python and SQL. Collaborate cross-functionally with product owners, data scientists, platform teams, and engineering leads. Lead architecture reviews, POCs, and technology evaluations to guide technology adoption and innovation. Mentor and guide senior and mid-level data engineers across solutioning, technical decisions, and design patterns. Required Skills & Experience: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related discipline. 12+ years of experience in data engineering and architecture, with at least 3+ years in a lead/architect role. Must have the ability to take things to completion overcoming challenges like unavailability of stakeholders or information. Expert-level knowledge in
Python ,
SQL , and scalable distributed systems. Deep expertise in
streaming architectures
with Kafka, Flink, and Spark Streaming. Strong experience designing and implementing
batch data workflows
using Apache Spark (PySpark). In-depth knowledge of
cloud data platforms , particularly Azure (preferred). Extensive experience with
data orchestration
tools like Airflow and Azure Data Factory. Proven ability to design and implement
data lakehouse architectures
with formats like Parquet, Delta Lake, and Iceberg. Proficiency in
containerization
(Docker), orchestration (Kubernetes), and deployment via Helm. Solid experience implementing
observability, lineage, and data quality frameworks , such as Great Expectations. Strong background in
infrastructure as code
(Terraform) and CI/CD integrations. Deep understanding of
data governance , metadata management, and data privacy standards. Excellent communication skills with the ability to influence stakeholders and present architectural strategies to leadership. Experience working in
agile development environments
and fostering team-wide engineering excellence. Experience with large-scale API architecture and scalable data access layers is a plus. Familiarity with AI/ML-integrated data systems or tools (e.g., text-to-SQL) is a bonus. We have an amazing team of 700+ individuals working on highly innovative enterpriseprojects & products. Our customer base includes Fortune 100 retail and CPGcompanies, leadingstore chains, fast-growth fintech, and multipleSilicon Valley startups. What makes Confiz stand out is our focus on processes and culture. Confiz is ISO 9001:2015,27001:2013 & 2000-1:2018 certified. We have a vibrant culture of learning via collaboration and making the workplace fun. People who work with us work with cutting-edge technologies while contributing success to the company as well as to themselves. To know more about Confiz Limited, visit: