
Manpreet Kaur Chandi

Papakura, New Zealand

Summary

  • 7+ years of IT experience in the design, development, implementation and support of SQL Server 2012/2014/2016.
  • Skilled in data extraction, cleansing, transformation and loading (ETL) between SQL Server, flat files and Excel files.
  • Experience importing and exporting data between different sources such as SQL Server and Excel.
  • Expertise in using global variables, expressions and functions in SSRS reports, with extensive experience handling sub-reports.
  • Experienced in installing and configuring Reporting Services and assigning permissions to different user levels in SSRS.
  • Built SSIS packages collecting data from various sources; worked with the SSIS Data Conversion, Aggregate, Merge Join and Conditional Split transformations.
  • Deployed reports to the report server and implemented subscription functionality.
  • Involved in all phases of data migration projects, including scope study, requirements gathering, design, development, implementation and acceptance testing for end-to-end IT solutions.
  • Self-motivated, organized team player with strong problem-solving and analytical skills; able to handle multiple tasks, take initiative and adapt, with full commitment to organizational goals.
  • Experience in HTML and CSS, and in debugging applications using the tools in the Visual Studio .NET IDE.
  • Quick to grasp and master new concepts and technologies.
  • Knowledge of AWS (S3, EC2, databases) and of Azure functionality.
  • Worked with SSAS cubes and with snowflake and star schemas.
  • Worked with Power BI for various reporting requirements.
  • Developed SQL queries to research, analyse and troubleshoot data and to create business reports.
  • Experience using the Microsoft BI stack (SSIS, SSAS, SSRS) to implement ETL: data extraction, transformation and loading.
  • Knowledge of data warehouse analysis, building cubes with SQL Server Analysis Services (SSAS).
  • Created Snowpipe for continuous data loading; in-depth knowledge of Snowflake and dbt.
  • Worked with Snowflake features such as zero-copy cloning, Time Travel, roles and privileges, virtual warehouse design and external tables.
  • Set up data sharing between two Snowflake accounts.
  • Created external stage tables on Snowflake; loaded data from SQL Server into Snowflake using Stitch.
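The Snowpipe and external-stage work above can be sketched roughly as follows. This is a minimal, hypothetical example, not from any real project: the stage URL, credentials placeholder, table and pipe names are all illustrative, and it assumes JSON files landing in cloud storage.

```sql
-- Hypothetical names throughout.
CREATE STAGE raw_stage
  URL = 'azure://myaccount.blob.core.windows.net/landing/'
  CREDENTIALS = (AZURE_SAS_TOKEN = '...');

-- Target table for the raw feed.
CREATE TABLE raw_orders (payload VARIANT, loaded_at TIMESTAMP_NTZ);

-- Snowpipe: copies each new file from the stage as it arrives.
CREATE PIPE orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_orders (payload, loaded_at)
  FROM (SELECT $1, CURRENT_TIMESTAMP() FROM @raw_stage)
  FILE_FORMAT = (TYPE = 'JSON');
```

Note that `AUTO_INGEST = TRUE` additionally requires cloud-storage event notifications to be configured so the pipe is triggered on new files.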

Results-driven data engineering professional with a solid foundation in designing and maintaining scalable data systems. Expertise in developing efficient ETL processes and ensuring data accuracy, contributing to impactful business insights. Known for strong collaborative skills and the ability to adapt to dynamic project requirements, delivering reliable and timely solutions.

Overview

7+ years of professional experience

Work History

Data Engineer

National Informatics Center
  • Evaluated client information requirements
  • Created SSRS reports using report parameters, drop-down parameters and multi-valued parameters; debugged parameter issues; built matrix reports and charts
  • Gathered software requirements from clients and end users
  • Reviewed query execution plans to ensure every query used appropriate indexes
  • Worked extensively on ETL processes using SSIS packages
  • Hands-on experience creating star and snowflake schemas
  • Created parameterized and linked reports (table, chart and matrix), with thorough knowledge of report-serving architecture
  • Involved in creating Star schema cubes using SSAS
  • Worked on OLAP cubes using SSAS
  • Wrote stored procedures and SQL queries
  • Handled large data sets and reports
  • Created and modified SSRS reports
  • Worked on SQL Server indexes to enhance performance
  • Used HTML and CSS to design and develop the application's MVC UI
  • Gathered business requirements and converted them into SQL stored procedures for database-specific projects
  • Created files, views, tables and data sets to support Sales Operations
  • Created and scheduled SQL Agent jobs and maintenance plans
  • Worked with flat files and Excel files in SSIS
  • Environment: ASP.NET, C#, MVVM, OOP, HTML, CSS, SQL Server
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Fine-tuned query performance and optimized database structures for faster, more accurate data retrieval and reporting.
  • Enhanced data quality by performing thorough cleaning, validation, and transformation tasks.
  • Streamlined complex workflows by breaking them down into manageable components for easier implementation and maintenance.
  • Optimized data processing by implementing efficient ETL pipelines and streamlining database design.
  • Migrated legacy systems to modern big-data technologies, improving performance and scalability while minimizing business disruption.
  • Provided technical guidance and mentorship to junior team members, fostering a collaborative learning environment within the organization.
  • Increased efficiency of data-driven decision making by creating user-friendly dashboards that enable quick access to key metrics.
  • Designed scalable and maintainable data models to support business intelligence initiatives and reporting needs.
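As a flavour of the stored-procedure and multi-valued-parameter reporting work listed above, here is a minimal T-SQL sketch. The table, procedure and parameter names are hypothetical, and `STRING_SPLIT` assumes SQL Server 2016 or later; SSRS typically joins a multi-value parameter into one comma-separated string before passing it in.

```sql
-- Hypothetical procedure backing a multi-valued SSRS report parameter.
CREATE PROCEDURE dbo.usp_SalesByRegion
    @StartDate DATE,
    @EndDate   DATE,
    @Regions   NVARCHAR(MAX)  -- comma-joined multi-value parameter from SSRS
AS
BEGIN
    SET NOCOUNT ON;

    SELECT s.Region,
           s.SaleDate,
           SUM(s.Amount) AS TotalAmount
    FROM dbo.Sales AS s
    WHERE s.SaleDate BETWEEN @StartDate AND @EndDate
      AND s.Region IN (SELECT value FROM STRING_SPLIT(@Regions, ','))
    GROUP BY s.Region, s.SaleDate;
END;
```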

Software Engineer (Intern)

EICE International Pvt. Ltd, Noida
  • Evaluated client information requirements
  • Developed classes, objects, dataset classes and methods per business requirements
  • Retrieved and manipulated data using ADO.NET data objects
  • Wrote stored procedures and SQL queries
  • Used HTML and CSS to design and develop the application UI with ASP.NET, MVVM and the ASPX view engine
  • Environment: ASP.NET, C#, MVVM, OOP, HTML, CSS, SQL Server

Data Engineer

Woolworths New Zealand
09.2022 - Current
  • Worked with pipelines for customer data
  • Working on three major projects: data migration, new loyalty programme data and secure (PII) data
  • Proficient in using dbt (data build tool) for data modelling and transformation
  • Strong understanding of relational databases and SQL
  • Familiar with GCP; worked on BigQuery projects
  • Implemented best practices for data modelling, schema design and documentation
  • Worked with Apache Airflow for workflow orchestration and scheduling
  • Used Python to create data pipelines
  • Worked on data migration from the data mart to BigQuery (GCP) using dbt
  • Designed the implementation of the different data-layer zones
  • Ingested data from the data mart into BigQuery using dbt
  • Worked on a project with real-time loyalty data for Countdown's Onecard replacement
  • Used dbt, SQL, BigQuery and Airflow
  • Collaborated with cross-functional teams to understand data requirements and deliver AWS-based data solutions
  • Monitored and troubleshot data systems and pipelines on AWS using services such as CloudWatch
  • Optimized data processing and storage for performance and cost-efficiency on AWS
  • Designed, built and maintained data pipelines and ETL processes on AWS using services such as AWS Glue, AWS Lambda and AWS Step Functions
  • Evaluated various tools, technologies, and best practices for potential adoption in the company's data engineering processes.
  • Collaborated with cross-functional teams for seamless integration of data sources into the company's data ecosystem.
  • Conducted extensive troubleshooting to identify root causes of issues and implement effective resolutions in a timely manner.
  • Developed database architectural strategies at modeling, design, and implementation stages to address business or industry requirements.
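The data-mart-to-BigQuery migration with dbt described above typically centres on models like the following. This is a hypothetical sketch (the source, model and column names are invented): an incremental staging model that also masks a PII column before it reaches downstream layers.

```sql
-- models/staging/stg_loyalty_transactions.sql (hypothetical dbt model)
{{ config(materialized='incremental', unique_key='transaction_id') }}

SELECT
    transaction_id,
    customer_id,
    TO_HEX(SHA256(email)) AS email_hash,  -- PII masked before downstream use
    amount,
    transaction_ts
FROM {{ source('raw', 'loyalty_transactions') }}
{% if is_incremental() %}
-- On incremental runs, only pick up rows newer than what is already loaded.
WHERE transaction_ts > (SELECT MAX(transaction_ts) FROM {{ this }})
{% endif %}
```

`dbt run` would compile the Jinja and execute the resulting SQL in BigQuery, creating or incrementally updating the table.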

Data Engineer

Trustpower powered by Mercury
12.2021 - 09.2022
  • Joined daily status meetings and interacted with the onshore team by email and calls to follow up on modules and resolve data and code issues
  • Used dbt for the ELT process
  • Transformed data using dbt and Snowflake
  • Supported analysts with data and processes in the creation of reports
  • Created schemas on Snowflake
  • Used Azure EFT/Blob Storage and Stitch to load data into Snowflake
  • Involved in data extraction, staging, transformation and loading
  • Experience with container-based deployments using Docker, working with Docker images
  • Familiar with Azure Data Factory and Azure DevOps
  • Used GitKraken for creating and managing pull requests
  • Closely worked with analysts
  • Knowledge of AWS
  • In-depth knowledge of Snowflake and dbt
  • Worked with different data sources; loaded data into Snowflake using dbt
  • Worked with Time Travel and cloning on Snowflake
  • Created a dbt macro for freshness tests
  • Defined roles and privileges on Snowflake so different users can access schema and database objects
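The role/privilege and Time Travel work above can be illustrated with a short, hypothetical Snowflake sketch (the role, user, database, schema and table names are all invented for illustration):

```sql
-- A read-only role for analysts, scoped to one reporting schema.
CREATE ROLE analyst_ro;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst_ro;
GRANT ROLE analyst_ro TO USER jane_doe;

-- Time Travel: query a table as it was an hour ago...
SELECT * FROM analytics.reporting.orders AT (OFFSET => -3600);

-- ...or restore that historical state as a zero-copy clone.
CREATE TABLE orders_restored
  CLONE analytics.reporting.orders AT (OFFSET => -3600);
```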

Data Engineer /SQL developer

National Informatics Centre
01.2020 - 08.2021
  • Understood and analyzed agents' expectations
  • Wrote complex stored procedures and functions
  • Created various SSRS reports, including drill-down, drill-through, matrix and parameterized reports
  • Experience developing database schemas, stored procedures, full backups and database restoration
  • Deployed SSIS packages and scheduled them through SQL Server Agent jobs in all tiers (dev/test/production)
  • Planning and estimation of deliverables
  • Analysis of the requirements for the development of the reports
  • Providing guidance for the team in technical and functional areas
  • Developed SQL scripts to Insert/Update and Delete data in MS SQL database tables
  • Created various database objects including Tables, User Defined Functions, Procedures, Triggers, Indexes and Views based on user requirements
  • Created Mapping documents for data extraction, transformation and loading
  • Managed Data quality & integrity using skills in Data Warehousing, Databases & ETL
  • Generated and formatted reports using global variables, expressions and functions
  • Used various SSIS control flow tasks (loop containers, SQL Server Agent jobs, Execute SQL, For Each Loop containers) and data flow transformations (Conditional Split, Derived Column, Lookup, Merge Join, Multicast, Union All, Data Conversion)
  • Designed BI dashboards and worked with the Power BI query editor
  • Good knowledge of data visualization, graphs and charts using Power BI
  • Worked with Snowflake
  • Created external stage tables on Snowflake
  • Worked with cloning and Time Travel on Snowflake
  • Worked with different data sources
  • Defined virtual warehouse sizing on Snowflake for different types of workloads
  • Environment: HTML, CSS, SQL Server 2014, SSRS, SSIS, SSAS, basic Power BI
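The virtual-warehouse sizing mentioned above is typically done per workload, so loads and BI queries do not contend for the same compute. A hypothetical sketch (warehouse names, sizes and suspend timeouts are illustrative only):

```sql
-- Small warehouse for ELT loads; suspends quickly when idle.
CREATE WAREHOUSE load_wh
  WAREHOUSE_SIZE = 'SMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;

-- Larger warehouse for interactive reporting; longer idle window.
CREATE WAREHOUSE reporting_wh
  WAREHOUSE_SIZE = 'MEDIUM' AUTO_SUSPEND = 300 AUTO_RESUME = TRUE;
```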

Education

B.Tech - CSE

Uttarakhand Technical University
01.2018

Skills

  • Languages: SQL, HTML, CSS
  • Frameworks: .NET, Java
  • Reporting Tools: MS SQL Server Reporting Services (SSRS)
  • Cloud: Snowflake, Snowpipe, Azure Blob Storage, EFT, Azure, AWS
  • ETL Tools: SQL Server Integration Services (SSIS), Analysis Services (SSAS), Power BI, Stitch (Talend), dbt
  • Other Tools: GitKraken, GitHub
  • Databases: Microsoft SQL Server 2014/2016
  • OS and Platforms: Windows XP, 7, 10
  • Applications: MS Word, MS Excel, Jira
  • Training: Snowflake fundamentals training hosted by Snowflake
  • Orchestration: Airflow, dbt Cloud orchestration
  • ETL development
  • Data warehousing
  • Data modeling
  • Data migration
  • SQL expertise
  • Data analysis
  • Database design
