Tekvaly is seeking a Data Engineer in Canada on behalf of its client.
Who We Are
Tekvaly is a diversified global software development and IT consulting company that provides both offshore and onshore technical solutions to business enterprises. Our mission is to enable superior returns on clients’ technology investments through best-in-class industry solutions, domain expertise, and global scale. We feel deeply connected to our customers, so our success isn’t just a matter of our bottom line, but a reflection of how our customers flourish and how their communities thrive. We strive to understand each customer’s individual needs so that we can develop products and services that enhance their livelihoods. Our customers are our partners, and when we rise, we rise together.
As a Data Engineer, you will power the future of data by building scalable pipelines, optimizing cloud warehouses, and transforming raw data into actionable insights.
Responsibilities:
- Design, build, and maintain scalable ETL pipelines and data workflows using Apache Airflow, Spark, or similar orchestration tools.
- Manage data warehouses and data lakes (Snowflake, BigQuery, Redshift) to support high-performance analytics and reporting.
- Perform data extraction, transformation, and loading from diverse sources including databases, APIs, and streaming systems such as Kafka.
- Develop and maintain both batch and real-time data pipelines to support analytical and operational use cases.
- Optimize SQL queries and data models for performance at scale by implementing partitioning, indexing, and caching strategies.
- Implement data quality checks, automated data validation, and pipeline monitoring to ensure reliability, accuracy, and SLA compliance.
- Maintain logging and alerting mechanisms to proactively detect pipeline failures.
- Deploy and manage data infrastructure using Docker, Kubernetes, and cloud platforms (AWS, GCP, Azure), and manage infrastructure through Infrastructure-as-Code tools such as Terraform.
- Support dashboard development by providing optimized datasets for Power BI, Tableau, or Looker.
- Collaborate with data analysts, data scientists, and business stakeholders to deliver clean, reliable, and well-structured datasets for analytics and reporting.
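The batch-pipeline responsibilities above can be sketched, in very simplified form, as an extract-validate-transform-load job with a data-quality gate. Everything here is illustrative: the source data, field names, and thresholds are invented for the example, and a real pipeline would run under an orchestrator such as Airflow rather than as a script.

```python
# Minimal batch ETL sketch: extract -> validate -> transform -> load.
# All source/target names below are hypothetical placeholders.

def extract() -> list[dict]:
    # In practice this would read from a database, API, or Kafka topic.
    return [
        {"order_id": 1, "amount": "19.99", "country": "ca"},
        {"order_id": 2, "amount": "5.00", "country": "US"},
        {"order_id": 3, "amount": "-1.00", "country": "US"},  # bad row
    ]

def validate(rows: list[dict]) -> list[dict]:
    # Simple data-quality checks: required fields present, amount non-negative.
    required = {"order_id", "amount", "country"}
    clean = [r for r in rows
             if required <= r.keys() and float(r["amount"]) >= 0]
    if len(clean) < len(rows):
        # A real pipeline would log/alert here instead of just printing.
        print(f"dropped {len(rows) - len(clean)} bad rows")
    return clean

def transform(rows: list[dict]) -> list[dict]:
    # Normalize types and derive warehouse-friendly fields.
    return [{"order_id": r["order_id"],
             "amount_cents": round(float(r["amount"]) * 100),
             "country": r["country"].upper()}
            for r in rows]

def load(rows: list[dict]) -> int:
    # Stand-in for a warehouse write (e.g. a Snowflake or BigQuery insert).
    return len(rows)

if __name__ == "__main__":
    loaded = load(transform(validate(extract())))
    print(f"loaded {loaded} rows")
```

In an orchestrated deployment, each of these functions would typically become a separate task so that failures can be retried and monitored independently.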
Requirements:
- Hands-on experience in data engineering with strong SQL proficiency.
- Hands-on expertise with Python (Pandas, PySpark) and ETL/orchestration tools such as Airflow.
- Experience with cloud data platforms such as Snowflake, BigQuery, or Databricks.
- Familiarity with containerization (Docker) and CI/CD practices for data pipelines.
- Knowledge of streaming technologies such as Kafka or Kinesis for real-time data processing.
- Understanding of data modeling concepts including star schema and dimensional modeling.
- Experience implementing data quality checks and testing frameworks for reliable pipelines.
- Bachelor’s degree in Computer Science, Data Analytics, or a related field preferred.
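As a rough illustration of the dimensional-modeling concepts listed above, the sketch below splits flat sales records into a customer dimension and a fact table linked by surrogate keys, the core move behind a star schema. All field names are invented for the example.

```python
# Toy star-schema split: flat rows -> customer dimension + sales fact table.
# Field names are illustrative only.

def build_star_schema(flat_rows: list[dict]) -> tuple[dict, list[dict]]:
    """Return (customer_dim, fact_rows) joined by surrogate customer keys."""
    customer_dim: dict[int, dict] = {}      # surrogate key -> dimension row
    natural_to_surrogate: dict[str, int] = {}
    facts: list[dict] = []
    for row in flat_rows:
        natural_key = row["customer_email"]
        if natural_key not in natural_to_surrogate:
            sk = len(customer_dim) + 1      # simple sequential surrogate key
            natural_to_surrogate[natural_key] = sk
            customer_dim[sk] = {"customer_email": natural_key,
                                "customer_name": row["customer_name"]}
        facts.append({"customer_sk": natural_to_surrogate[natural_key],
                      "amount": row["amount"]})
    return customer_dim, facts

flat = [
    {"customer_email": "a@x.com", "customer_name": "Ada", "amount": 10},
    {"customer_email": "b@x.com", "customer_name": "Bob", "amount": 20},
    {"customer_email": "a@x.com", "customer_name": "Ada", "amount": 5},
]
dim, facts = build_star_schema(flat)
```

Deduplicating descriptive attributes into the dimension keeps the fact table narrow, which is what makes partitioning and aggregation over it cheap at scale.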
Soft Skills We Like to See:
- Excellent communication skills.
- Adaptability and willingness to learn.
- Problem-solving mindset.
- Analytical skills.
- Ability to work in a team environment and collaborate effectively with others.
******************************************************************************************************************************************************
Accommodations will be provided on request for candidates taking part in all aspects of our recruitment and selection process.
We thank all candidates for their interest; however, only those selected for an interview will be contacted.
*******************************************************************************************************************************************************