I have been in the software development field for the last 7.10 years. I have provided complete life-cycle software development, including requirements definition, analysis, design, development, testing, implementation, and support, to different clients. I have experience as a software developer, providing motivation and guidance to my offshore team.
Roles & Expertise:
DP-900 (Microsoft Certified: Azure Data Fundamentals).
Received Wall of Fame recognition multiple times at the Wipro DAAI practice level.
Interested in working on realistic problems; hungry to find solutions for seemingly impossible cases.
Actively connected with the onsite lead and client to understand requirements and develop user-friendly systems.
Highly experienced in Agile ceremonies: daily stand-ups, planning, management, and demos.
Creation and modification of technical specification and design documents.
As an offshore team member, understood the design, development, and implementation phases from the onsite lead and explained them to junior team members.
Good knowledge of dimensions, facts, star schema, snowflake schema, OLTP, and OLAP.
Good hands-on knowledge of Python (lists, tuples, dictionaries, functions, exception handling, inheritance, file handling); see the short sketch after this list.
Strong analytical and problem-solving skills, with strong communication and interpersonal skills at both the technical and user level.
Ability to work individually as well as in a team, with excellent problem-solving and troubleshooting capabilities.
Ability to learn new tools, concepts, and environments.
Good team player; self-motivated; quick learner.
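A minimal, illustrative Python sketch (not drawn from any project below) touching the fundamentals listed in this section: file handling, exception handling, lists, dictionaries, functions, and simple inheritance. The file name and column names are hypothetical.

# Illustrative sketch only: exercises the Python basics listed above
# (lists, dictionaries, functions, exception handling, inheritance, file handling).
class Reader:
    """Base reader that returns stripped lines from a text file."""

    def __init__(self, path):
        self.path = path

    def read_lines(self):
        try:
            with open(self.path, encoding="utf-8") as fh:    # file handling
                return [line.strip() for line in fh]          # list comprehension
        except FileNotFoundError:                             # exception handling
            return []


class CsvReader(Reader):
    """Inherits Reader and parses each comma-separated line into a dictionary."""

    def __init__(self, path, columns):
        super().__init__(path)
        self.columns = columns                                # e.g. ("id", "name")

    def read_records(self):
        records = []
        for line in self.read_lines():
            values = line.split(",")
            records.append(dict(zip(self.columns, values)))   # dictionary per row
        return records


if __name__ == "__main__":
    reader = CsvReader("sample.csv", ("id", "name"))          # hypothetical file
    for record in reader.read_records():
        print(record)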
Overview
9 years of professional experience
Work History
Data Analyst
MVP STUDIO
08.2024 - Current
MVP Studio is a startup incubator that undertakes software product development for startups and enterprises. I am currently working on a cloud-based property investment and management platform, specifically targeting the data integration of Australian property data. I am also designing the data warehouse solution for the visualization and reporting services.
Roles & Responsibilities:
Utilized MSSQL Server, SSIS, Power Query, and Power BI to develop a cloud-based property investment and management platform for the data integration of Australian property data.
Implemented query optimization of SSRS reports and stored procedures to enhance performance.
Integrated property data from multiple government data sources to build a complete dataset (see the short sketch after this list).
Created and maintained SSRS reports using a data warehouse as the data source.
Designed the data warehouse and SSIS packages consistent with the existing design.
Created SQL queries, stored procedures, and functions to implement medium-to-complex business logic for reporting and analysis purposes.
Created interactive Power BI dashboards to provide actionable insights for property data analysis.
Developed and implemented data pipelines using MSSQL Server and Power Query for efficient data integration.
Used the Kimball methodology (star schema and snowflake schema).
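A minimal sketch of the kind of multi-source property-data integration mentioned above, written with pandas; the file names, columns, and join key are hypothetical, and the production integration was built with SSIS/Power Query against MSSQL Server rather than in Python.

# Hypothetical sketch: combine property records from two government-style
# sources into one dataset keyed on a parcel identifier. Paths and column
# names are illustrative, not the real platform schema.
import pandas as pd


def build_property_dataset(sales_path, valuations_path):
    sales = pd.read_csv(sales_path, parse_dates=["sale_date"])
    valuations = pd.read_csv(valuations_path)

    # Normalise the join key so records from both sources line up.
    for frame in (sales, valuations):
        frame["parcel_id"] = frame["parcel_id"].astype(str).str.strip().str.upper()

    # Keep one valuation per parcel, then left-join onto sales so no sale is lost.
    valuations = valuations.drop_duplicates(subset="parcel_id")
    return sales.merge(valuations, on="parcel_id", how="left", suffixes=("", "_val"))


if __name__ == "__main__":
    dataset = build_property_dataset("sales.csv", "valuations.csv")
    print(dataset.head())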
Senior Analyst
Course5 Intelligence
Bangalore
09.2022 - 07.2023
Led the team and was involved in understanding functional requirements with the business and other team members.
Data solution design: collaborated with stakeholders to understand business requirements and design scalable, efficient data solutions on Azure, including selecting the appropriate Azure services and components to build data pipelines and storage solutions.
Data pipeline development: developed and implemented data ingestion and transformation pipelines using Azure Data Factory, Azure Databricks, and other relevant tools; extracted data from various sources, performed data cleansing, transformation, and enrichment, and loaded it into data lakes or data warehouses.
Data modeling and warehousing: designed and implemented data models for data warehousing solutions using Azure Synapse Analytics, Azure SQL Data Warehouse, and similar technologies; optimized data structures for performance and scalability and ensured data integrity and consistency.
Created end-to-end pipelines using Azure Data Factory with on-demand clusters and created the linked services between the source and destination.
Created Databricks clusters and attached notebooks to transform raw data into refined data using Python (see the sketch below).
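A minimal sketch of the kind of raw-to-refined notebook transformation described above, using PySpark as it might run on a Databricks cluster; the storage paths, column names, and filters are hypothetical.

# Hypothetical raw-to-refined transformation as it might appear in a Databricks
# notebook. Paths, columns, and filters are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw_to_refined").getOrCreate()

raw_path = "/mnt/datalake/raw/orders/"          # hypothetical mount point
refined_path = "/mnt/datalake/refined/orders/"

raw = spark.read.option("header", "true").csv(raw_path)

refined = (
    raw.dropDuplicates(["order_id"])                         # remove duplicate rows
       .withColumn("order_date", F.to_date("order_date"))    # normalise types
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())                  # drop unusable records
)

# Write the refined layer as Parquet, partitioned by date for downstream queries.
refined.write.mode("overwrite").partitionBy("order_date").parquet(refined_path)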
WIPRO Limited
Bangalore
09.2015 - 05.2022
Senior Data Analyst
Chevron Inc
02.2020 - 05.2022
Chevron Corporation is an American multinational energy corporation. One of the successor companies of Standard Oil, it is headquartered in San Ramon, California, and active in more than 180 countries. Chevron is engaged in every aspect of the oil and natural gas industry, including hydrocarbon exploration and production; refining, marketing, and transport; chemicals manufacturing and sales; and power generation.
Responsible for extracting data from various Azure data storage services and loading it into cloud data stores for further transformations using Azure Data Factory.
Created end-to-end pipelines using Azure Data Factory with on-demand clusters and created the linked services between the source and destination.
Conducted knowledge transfer sessions with new team members to build application knowledge and to support the team in adapting to upcoming changes.
Designed and implemented data models for data warehousing solutions in Azure Synapse Analytics, optimizing data structures for performance and scalability.
Capital One Bank
12.2017 - 02.2020
Capital One Financial Corporation is an American bank holding company specializing in credit cards, auto loans, banking, and savings accounts, headquartered in McLean, Virginia, with operations primarily in the United States. It is on the list of largest banks in the United States, is the third-largest issuer of Visa and Mastercard credit cards, and is one of the largest car finance companies in the United States.
Senior Data Engineer
Charles Schwab
02.2017 - 11.2017
Responsibilities:
Worked in an Agile/Scrum environment using JIRA across the rationalization, migration, cutover, and decommission phases.
Prepared analysis documents for the pre-prod, cutover, and decommission phases.
The AROW scheduler was newly introduced to the entire team; actively participated in KT sessions with the AROW team and then transitioned the knowledge to the rest of the team.
Prepared the Change Orders with the necessary documents
Delivered more than the required velocity of EME migrations/cutovers per sprint.
Reviewed graph changes and ensured there were no tag differences.
Managed incidents across all the PODs and prepared incident classification reports for the Program Manager.
Performed health and hygiene checks and created reports for the PO (product analysis).
Worked on end-to-end migration from on-prem servers to cloud servers.
The Charles Schwab Corporation is an American multinational financial services company. It offers banking, commercial banking, an electronic trading platform, and wealth management advisory services to both retail and institutional clients.
Data Engineer
Dun & Bradstreet
12.2015 - 01.2017
Worked on assignments in Agile mode, each sprint spanning 2 weeks
Reusable joblet for deleting stale files: developed a reusable Talend joblet that detects files older than a given number of days (received as input through a context variable) and deletes them from the target path; this was included in the flat-file ingestion and MongoDB ingestion jobs (see the sketch at the end of this section).
RabbitMQ ingestion framework: designed and developed a generic job in Talend for near-real-time ingestion of messages from RabbitMQ; it is a configurable job, used as a framework, that runs 24/7 and polls messages from a specified message queue.
Optimization of a job to load high-volume data: created an optimized version of the Teradata-to-Hive job to load tables with a high volume of data; to achieve this, the job was made to recursively call a child job which would load the data in chunks.
Client: Dun & Bradstreet helps companies leverage data and analytical insights to take more intelligent actions and deliver a competitive edge. Data and insights are delivered through the Dun & Bradstreet Data Cloud and the solutions it powers.
Roles & Responsibilities:
Completed various POCs for upcoming projects.
Requirement identification and analysis
Design, coding, deployment
Code review and documentation review
Created design and technical specification documents.
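A minimal Python sketch of the stale-file-deletion logic described under this role; the original was implemented as a reusable Talend joblet driven by a context variable, so the parameter names and paths here are hypothetical.

# Hypothetical Python rendering of the stale-file cleanup: delete files in a
# target path that are older than a given number of days. The real
# implementation was a Talend joblet; names here are illustrative.
import os
import time


def delete_stale_files(target_path, max_age_days):
    """Remove files older than max_age_days and return the deleted paths."""
    cutoff = time.time() - max_age_days * 24 * 60 * 60
    deleted = []
    for name in os.listdir(target_path):
        full_path = os.path.join(target_path, name)
        # Only regular files whose last-modified time is before the cutoff.
        if os.path.isfile(full_path) and os.path.getmtime(full_path) < cutoff:
            os.remove(full_path)
            deleted.append(full_path)
    return deleted


if __name__ == "__main__":
    # max_age_days plays the role of the Talend context variable.
    for path in delete_stale_files("/data/ingestion/landing", max_age_days=7):
        print("Deleted:", path)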
Education
M.Tech - Software Engineering
Birla Institute of Technology and Science
B.C.A. (Bachelor of Computer Applications) -
Himachal Pradesh University
Skills
Technical Expertise
BUSINESS INTELLIGENCE
Microsoft BI suite, SSIS, SSAS, SSRS, DAX and Power Query
DATABASES
MSSQL Server, SQL, Stored Procedures, Functions, Performance and Data Warehouse design
VISUALISATION
SSRS, Power BI, Tableau
OTHER TECHNOLOGIES
Talend, CDRS, Ab Initio, Teradata, Unix, AWS, Snowflake, Azure Data Factory, Blob Storage, Data Lake Gen1/Gen2, Azure