Find the one for the job
- Step 1: Search IT resources by skill set
- Step 2: Show interest
- Step 3: Schedule an interview
- Step 4: Hire hassle-free with Bench Geeks
CAN00284
Skill Set
- AWS
- Azure
- SQL
Experience 6 Years - 8 Years
Work Style Remote
Duration Immediate
Location Pune
Project Portfolio
Duration: Jan 2022 – Dec 2024
Project Profile: NBFC Analytics & MIS Automation
Role: Senior Data Engineer
- Developed PySpark scripts to clean data and implement transformations that apply the business rules.
- Partitioned data in Delta Lake tables for optimized querying.
- Reduced manual data processing by 80%, resulting in a faster, more reliable reporting system that improved data-driven decision-making.
- Created ADF workflows for automated, scheduled ingestion of data from source systems.
- Wrote Python scripts to import files from SFTP to Azure Blob Storage.
- Wrote complex SQL scripts for data testing, verifying data from the sources through to the final consumption views and ensuring it aligned with the business rules.
- Created a cloud data lake strategy for migrating the existing setup to the cloud for easier administration, faster analytics, and lower infrastructure costs.
- Performed dimensional modelling for the data warehouse for optimal storage, performance, scalability, and easier authoring of analytical queries.
- Documented complex logic implemented in transformations to keep track of functional implementations.
- Created RCA documents to track issues occurring in higher environments and suggested short-term and long-term solutions.
- Wrote Python code to turn manual Excel actions into an automated process, helping business users compare their data with the data generated in the Data Lakehouse.

Software/Special Tools: SQL, Azure Databricks, Azure Data Lake Storage, Python, Azure Data Factory & MS SQL Server
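The source-to-consumption data testing described above could look something like the following minimal sketch. The table, view, and column names are illustrative (the actual schema is not given in the portfolio), and SQLite stands in for MS SQL Server so the example is self-contained.

```python
import sqlite3

def reconcile_counts(conn, source_table, view_name, key_column):
    """Compare distinct key counts between a source table and a consumption view.

    A non-zero delta flags rows lost (or duplicated) somewhere in the
    transformation chain between staging and the final view.
    """
    cur = conn.cursor()
    src = cur.execute(
        f"SELECT COUNT(DISTINCT {key_column}) FROM {source_table}"
    ).fetchone()[0]
    tgt = cur.execute(
        f"SELECT COUNT(DISTINCT {key_column}) FROM {view_name}"
    ).fetchone()[0]
    return {"source": src, "view": tgt, "delta": src - tgt}

# Demo with an in-memory database (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_loans (loan_id INTEGER, amount REAL);
    INSERT INTO stg_loans VALUES (1, 100.0), (2, 250.0), (3, 75.0);
    -- Consumption view: only loans at or above a reporting threshold.
    CREATE VIEW vw_loans AS SELECT * FROM stg_loans WHERE amount >= 100.0;
""")
result = reconcile_counts(conn, "stg_loans", "vw_loans", "loan_id")
# → {'source': 3, 'view': 2, 'delta': 1}
```

In practice a check like this would run for every business rule, with a non-zero delta either expected (a documented filter) or raised as a data-testing failure.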
Duration: Feb 2019 – Dec 2021
Project Profile: Financial Planning & Quantitative Analytics
Role: Data Engineer
- Participated in discussions with business teams, technical leads, and architects to better understand the business scenarios being worked on.
- Implemented ADF pipelines to ingest data from sources such as SQL Server, CSV files, and Excel files into Azure Blob Storage.
- Implemented functional data quality checks using SQL to ensure the data would not break the transformations applied later in the model code.
- Troubleshot Azure Data Factory pipelines based on bugs reported by the report developers and analysts who used the data warehouse.

Software/Special Tools: Snowflake, T-SQL, MS Excel, Azure Data Factory, Azure Blob Storage & Python
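The pre-transformation quality checks mentioned above might be sketched as follows; this is a pure-Python stand-in for the SQL checks, since the actual rules and column names are not given in the portfolio.

```python
def quality_issues(rows, required_columns):
    """Scan staged rows for values that would break downstream transformations.

    Flags missing columns, NULLs (None), and empty strings. Returns a list of
    (row_index, column, problem) tuples; an empty list means the batch is clean.
    """
    issues = []
    for i, row in enumerate(rows):
        for col in required_columns:
            if col not in row:
                issues.append((i, col, "missing column"))
            elif row[col] is None:
                issues.append((i, col, "NULL value"))
            elif isinstance(row[col], str) and row[col].strip() == "":
                issues.append((i, col, "empty string"))
    return issues

# Hypothetical staged batch: two rows would break a downstream transformation.
staged = [
    {"account_id": "A1", "balance": 120.5},
    {"account_id": "A2", "balance": None},
    {"account_id": "", "balance": 42.0},
]
problems = quality_issues(staged, ["account_id", "balance"])
# → [(1, 'balance', 'NULL value'), (2, 'account_id', 'empty string')]
```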
Duration: Nov 2017 – Feb 2019
Project Profile: Insurance Analytics & Data Automation
Role: Junior Data Engineer
- Created data ingestion scripts in Python using the pandas library for external files containing data from market research companies, which was later used in the data warehouse.
- Worked on the data validation framework, which checked source data landed in the staging database for NULLs, empty strings, masked data, bogus data, and other forms of data distortion.
- Created SQL stored procedures performing complex calculations based on logic provided by the business analysts; the team lead plugged these into the SSIS framework to further transform the staged data.
- Created views on top of the SQL Server relational data warehouse containing some transformation logic.
- Used the views to load data into Power BI dashboards, which were used for ad-hoc analytics, presentations, and brainstorming before the final dashboard was implemented in the custom application by the app developers using amCharts.
- Handled changes to existing systems after analyzing the impact of each change across this and other systems using data from the data warehouse.
- Worked with technical support teams for a few months to troubleshoot data cases reported by clients and users.
- Helped create the data testing framework in Python, later used to test the data loaded into the warehouse after the nightly jobs finished; it contained user-defined functions for checks such as counts, date ranges, values between ranges, other dimension counts, and functional scenarios.

Software/Special Tools: Microsoft SQL Server, T-SQL, Power BI, MS Excel & Python
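A Python testing framework of the kind described above (counts, date ranges, values between ranges) could be sketched like this; the check names, columns, and thresholds are illustrative assumptions, not taken from the source.

```python
import datetime

def check_row_count(rows, minimum):
    """The nightly load should produce at least `minimum` rows."""
    return len(rows) >= minimum

def check_date_range(rows, column, start, end):
    """Every date in `column` must fall inside the expected load window."""
    return all(start <= row[column] <= end for row in rows)

def check_value_range(rows, column, low, high):
    """Numeric sanity check for a measure column."""
    return all(low <= row[column] <= high for row in rows)

def run_checks(rows, checks):
    """Run (name, callable) pairs over loaded rows; collect pass/fail results."""
    return {name: fn(rows) for name, fn in checks}

# Hypothetical rows loaded by a nightly job.
loaded = [
    {"policy_date": datetime.date(2018, 6, 1), "premium": 1200.0},
    {"policy_date": datetime.date(2018, 6, 2), "premium": 950.0},
]
results = run_checks(loaded, [
    ("row_count", lambda r: check_row_count(r, 1)),
    ("date_range", lambda r: check_date_range(
        r, "policy_date",
        datetime.date(2018, 6, 1), datetime.date(2018, 6, 30))),
    ("premium_range", lambda r: check_value_range(r, "premium", 0.0, 100000.0)),
])
# → {'row_count': True, 'date_range': True, 'premium_range': True}
```

Registering each check as a named callable keeps the framework extensible: new functional scenarios become new entries in the list rather than changes to the runner.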
CAN00283
Skill Set
- AWS
- Azure
- Python
Experience 10+ Years
Work Style Remote
Duration Immediate
Location Pune
Project Portfolio
Project 1
Project Name: SOMPO
Team Size: 11
Start Date: Dec 2022
End Date: Nov 2023
Project Description: Create and maintain various data pipelines in the Azure environment using Python and PySpark scripts.
Role & Contribution:
- Handled the Azure data pipelines, writing Python, PySpark, and SQL scripts on Azure Databricks.
- Worked on monitoring and error handling using Control-M.
- Currently managing a team of around 15 members for this data support and development activity.
Technology & Tools: Azure Databricks, Azure DevOps, Python, Spark, SQL, AWS components, Bitbucket, Jira, Control-M
Project 2
Project Name: SGC INDIA
Team Size: 11
Start Date: Aug 2022
End Date: Dec 2022
Project Description: Build a data lake on S3 for the customer's ML processing, coordinating with the team members working on the front end.
Role & Contribution:
- Worked with AWS services such as AWS Glue, AWS Lambda, and Terraform to implement the data lake using S3 and Cassandra.
- Wrote PySpark and Python scripts implementing the transformation logic and got them approved by the client.
- Led the team on the daily Scrum call for updates.
Technology & Tools: HDFS, HBase, Python, Spark, AWS components, Bitbucket, Jira
Project 3
Project Name: Lytx
Team Size: 11
Start Date: Mar 2022
End Date: Aug 2022
Project Description: Create and deploy an AWS pipeline for a fleet management area.
Role & Contribution:
- Created a physical architecture for the entire flow.
- Assisted the data engineers in the successful implementation of the architecture.
- Led the team on the daily Scrum call for updates.
Technology & Tools: HDFS, Python, Spark, AWS components
Project 4
Project Name: Institute of Electrical and Electronics Engineers (IEEE)
Team Size: 18
Start Date: Mar 2020
End Date: Mar 2022
Project Description: Create a framework and perform backend activities using AngularJS and Python.
Role & Contribution:
- Managed a team of around 18 members for the successful delivery of the project.
- Created a physical architecture for the entire flow.
- Responsible for creating data-grouping logic on conditions using PySpark and Python, and connecting with the ML team to create the algorithm.
- Finalized the user screens and assisted both front-end and back-end developers on a functional basis.
- Coordinated with the Scrum master on creating user stories in Jira.
- Helped the team create test cases.
- Interacted daily with higher management and the client to justify resource efforts and discuss the plan for the forthcoming sprint.
Technology & Tools: HDFS, Python, Spark, AWS components, AngularJS
Project 5
Project Name: Portland Group Electric (PGE)
Team Size: 8
Start Date: JUNE-2020
End Date: MAR-2020
Project Description:
T*r*n*
CAN00282 Node.js Developer
Project Portfolio
Name: CulturaLink
Roles: ReactJS Developer
Project Descriptions:
Integrated Power BI embed functionality for advanced analytics, allowing users to delve deeper into insights and trends within the platform interface.
Conducted thorough testing and debugging to ensure a smooth and error-free user experience.
Tools & Technologies: React.js, Node.js, Redux, Twilio, Socket.IO, GraphQL, TypeScript, Bootstrap

Name: WorkTok (Job Marketplace)
Roles: ReactJS Developer
Project Descriptions:
Utilized authentication techniques to authenticate users' identities and prevent unauthorized access.
Designed subscription packages to cater to various needs and preferences of service providers.
Implemented a feature for administrators to manually validate service providers by reviewing their uploaded documents, ensuring compliance and authenticity.
Enabled admins to manage provider requests, including approval, rejection, and verification processes. Provided tools for admins to oversee and manage approved providers, ensuring quality and reliability.
Developed functionalities for administrators to manage customer accounts and activities.
Enabled administrators to manage jobs created by customers, including review, approval, and moderation processes.
Implemented a notification system allowing administrators to send messages to all customers and providers, facilitating communication and updates.
Created a chat management module within the admin panel, enabling administrators to communicate with both customers and service providers. Provided administrators with the ability to view and manage all chat conversations between users.
Tools & Technologies: React.js, Node.js, Redux, Bootstrap, Socket.IO

Name: Zapify
Roles: Frontend Developer
Project Descriptions:
Tools & Technologies: ReactJS, NodeJS, eDRV, Stripe, MUI

Name: QR Phone (Real-time Communication Platform)
Roles: ReactJS Developer
Project Descriptions:
A*a*
CAN00281 Node.js Developer
Project Portfolio
- Lotto Social: A comprehensive lottery application that allows users to buy tickets, participate in games, track winnings, and engage in referral programs for rewards.
- Fork Freight: A logistics and parcel tracking platform offering businesses real-time shipment tracking and analytics.
- Dry Sign: A secure digital document management and e-signature platform for businesses.
- Fairy Trail: A subscription-based dating application providing a secure and intuitive matchmaking experience.
N*a*s*e*
CAN00280 UI/UX Designer
Project Portfolio: JIO TESSARACT
R*v*n*
CAN00279 UI/UX Designer
Project Portfolio: Techved
S*w*t*
CAN00278 Fullstack Developer
Project Portfolio
M*n*j
CAN00277 Fullstack Developer
Project Portfolio: Speedlabs (developed features such as Invoice, Practice, and Test)
P*y*l
CAN00276 Full Stack Developer
Project Portfolio: BuildFix – WT
K*s*m*n*
CAN00275 Flutter Developer
Project Portfolio: Project #1