Eqbal

AWS Architect - Big Data
Since 2011, Shaik Abdul Haffeez has worked professionally on the development of multi-cloud data lakes, data warehouse systems, Customer Data Platforms, and web-based applications using Java/J2EE technologies, Scala, Python, JavaScript, and big data processing frameworks. He consults on, designs, builds, and operationalizes large-scale enterprise data solutions using AWS and GCP data and analytics services. He has experience with DevOps processes and tooling, infrastructure-as-code frameworks, BI tools, and pipeline orchestration, as well as extensive experience implementing SOA applications using web services with XML, SOAP, WSDL, Apache Axis, and JAX-WS/JAX-RS. He has architected multiple data pipelines and end-to-end ETL and ELT processes for data ingestion and transformation.
CANDIDATE ID
AG001
YEARS OF EXPERIENCE
10+ years
EXPERT IN
Data Engineering
Machine Learning
Data Visualization
Python
Data Science
Amazon Web Services (AWS)
SQL
JavaScript
Google Cloud Platform
Shell Scripting

Skills

Libraries/APIs

JDBC, Servlets, JSP, REST and SOAP web services, GraphQL

Other

Windows, Unix, Hadoop, MapReduce, Hive, Sqoop, Spark, Apache Kafka, Apache NiFi, HDP, Databricks, Apache Tomcat and GlassFish, Eclipse, PyCharm, Maven, JUnit, SQL Developer, iReport, SQLyog, Git, Bitbucket, GitHub, GitLab, Power BI.

Languages

Python, SQL, Java, HTML, CSS, JavaScript, Node.js, TypeScript and Shell scripting.

 

Paradigms

Data Science

 

Platforms

Amazon Web Services (AWS), Google Cloud Platform (GCP), Databricks

 

Frameworks

Hibernate, MyBatis, Spring Boot, Spark, Terraform, AWS CDK, Pulumi.

 

Storage

MySQL, Oracle, Snowflake

 

Industry Expertise

Media, Steel Industry

Professional experience

Architect - Big Data

Agilisium                                          

2021 – Current                                  

Project 1

Client : Reliance Steel & Aluminium Co, US                              

Project Type : Web application

Tools : Snowflake, AWS 

Role : Architect

Period : From Sep 2021  

Roles and Responsibilities:

  • Responsible for managing a growing cloud-based data ecosystem and the reliability of the RSAC corporate data lake and analytics data mart.
  • Provide technical expertise and leadership to the Data Architecture team in all phases of work, including analysis, design and architecture, to develop and implement cutting-edge solutions.
  • Responsible for implementing wrapper solutions as part of data engineering solutions to address specific business needs, using the most appropriate techniques, data sources and technologies.
  • Responsible for migrating workloads from on-premises to cloud environments.

Technologies: Python, Spark, AWS EMR, Aurora, Glue, S3, Step Functions, Snowflake, AWS Lambda, EventBridge rules, AWS CDK, SQL Server and Qlik Sense.
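
The following is a minimal, illustrative sketch of the event-driven orchestration pattern described above, assuming AWS CDK v2 in Python; the stack name, Lambda handler and nightly schedule are hypothetical placeholders rather than project code.

# Illustrative AWS CDK (v2, Python) sketch: an EventBridge rule triggering a
# Step Functions state machine that wraps a Lambda ingestion step.
# All construct names and the schedule are hypothetical.
from aws_cdk import (
    Stack, Duration,
    aws_lambda as _lambda,
    aws_events as events,
    aws_events_targets as targets,
    aws_stepfunctions as sfn,
    aws_stepfunctions_tasks as tasks,
)
from constructs import Construct


class DataLakeIngestStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Lambda that lands source extracts into the S3 data lake (handler code not shown).
        ingest_fn = _lambda.Function(
            self, "IngestFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="ingest.handler",
            code=_lambda.Code.from_asset("lambda"),
            timeout=Duration.minutes(5),
        )

        # Minimal state machine: a single Lambda invoke task.
        state_machine = sfn.StateMachine(
            self, "IngestStateMachine",
            definition_body=sfn.DefinitionBody.from_chainable(
                tasks.LambdaInvoke(self, "RunIngest", lambda_function=ingest_fn)
            ),
        )

        # EventBridge rule: run the pipeline nightly at 02:00 UTC.
        events.Rule(
            self, "NightlyRule",
            schedule=events.Schedule.cron(hour="2", minute="0"),
            targets=[targets.SfnStateMachine(state_machine)],
        )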

Data Engineer Lead

Agilisium                                   

2016 – 2021

Project 2

Client : Universal Music Group (UMG), US                              

Project Type : Web application

Tools : GCP, AWS 

Role : Specialist

Period : From May 2016  

Roles and Responsibilities:

  • Designed and implemented data processing flows using AWS Data Pipeline workflows and GCP Cloud Composer.
  • Integrated data from both structured and unstructured data sources to create a single source of truth BI layer and new models for reporting and advanced analytics.
  • Handled different stages of data enrichment flowing through Google BigQuery tables without any data discrepancies.

Technologies: Java, MapReduce, Hive, Hue, Spark, AWS EMR, S3, EC2, Step Functions, Lambda, Spring Boot, Data Pipeline, Python, Redshift, Qubole, BigQuery, Google Cloud Storage, Google Dataflow, Apache Beam, Cloud Composer (Airflow), Cloud Data Transfer API, Cloud SQL and Kubernetes.
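
Below is an illustrative Cloud Composer (Airflow) DAG sketch of the GCS-to-BigQuery load-and-enrich flow described above; the DAG ID, bucket, dataset, table names and the placeholder SQL are assumptions, not project assets.

# Illustrative Airflow DAG for Cloud Composer: stage raw files from GCS into a
# BigQuery staging table, then run a SQL transform into the reporting layer.
# All resource names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_enrichment",              # hypothetical DAG id
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage the day's raw JSON files from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing-bucket",
        source_objects=["raw/{{ ds }}/*.json"],
        destination_project_dataset_table="analytics.staging_events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Enrich the staged data into the reporting table with a placeholder transform.
    enrich = BigQueryInsertJobOperator(
        task_id="enrich",
        configuration={
            "query": {
                "query": "SELECT * FROM analytics.staging_events",
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "enriched_events",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> enrich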

Data Engineering Lead

2015 – 2016   

Project 3

Client : Katch, US                              

Project Type : Web application

Tools : GCP, AWS 

Role : Specialist

Period : From Jan 2015 

Roles and Responsibilities:

  • Extracted data from existing databases, web sources and APIs into the data lake.
  • Implemented Apache Storm topologies to consume data from Kafka producers.
  • Implemented a Spring REST client to consume data from different APIs and loaded the data into the data lake.
  • Created and maintained Apache NiFi jobs with incremental loads to populate Hive external tables.
  • Extensive experience writing Hive queries to transform raw data from several data sources into baseline data.
  • Extended table columns for custom fields as per business requirements.
  • Built a suite of Linux scripts as a framework for monitoring daily jobs and log management.
  • Provided solutions to improve query performance in Hive.

Technologies: HDP, Java, Spring, Apache Kafka, Apache Storm, Druid, MapReduce, Hive, Hue, AWS S3, Apache NiFi, Python and MySQL.
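
As an illustration of the consume-and-land pattern above, the sketch below is a Python equivalent (kafka-python plus boto3) that reads events from a Kafka topic and batches them into S3 as newline-delimited JSON for Hive external tables; the topic, consumer group, bucket and key prefix are placeholders, and the project itself implemented this path with Storm and NiFi.

# Illustrative Kafka-to-data-lake loader: consume a topic and write batched
# newline-delimited JSON files under a dt= partition prefix in S3.
# Topic, group, bucket and prefix names are placeholders.
import json
import time

import boto3
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                                  # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
    group_id="datalake-loader",
)
s3 = boto3.client("s3")

batch, BATCH_SIZE = [], 500
for message in consumer:
    batch.append(message.value)
    if len(batch) >= BATCH_SIZE:
        # One file per flush, partitioned by date for the Hive external table.
        key = f"raw/events/dt={time.strftime('%Y-%m-%d')}/{int(time.time())}.json"
        body = "\n".join(json.dumps(record) for record in batch)
        s3.put_object(Bucket="example-datalake-bucket", Key=key, Body=body.encode("utf-8"))
        batch.clear()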

Data Engineer

2012 – 2014

Project 4

Client: UMG, US                              

Project Type: Web application

Tools: MySQL, AWS

Role: Data Engineer

Period: From Sep 2012

Roles and Responsibilities:

  • Implemented business logic for enhancements and developed web services.
  • Troubleshot critical production issues.
  • Involved in enhancements and application development.

Technologies: Java, Struts, Hibernate, AWS, web services (SOAP), GlassFish Server and MySQL.

Data Engineer

2011 – 2012

Project 5

Client : FOX, US                              

Project Type : Web application

Role : Data Engineer

Period : From July 2011

Roles and Responsibilities:

  • Implemented business logic for web service development.
  • Responsible for build and deployment.

Technologies: Java, Spring MVC, RESTful web services, JSON, MyBatis, IBM Netezza and GlassFish Server.

Certifications

Education

2008 – 2010

Master's in Computer Science

 JNTC

Customer testimonials

"Top notch candidates, helped me save a lot of time by providing the most qualified resources. They also simplify payroll, find solutions that work for you, and come at a really competitive price point"
"CoCreator perfectly understood the role we required and helped us find us the perfect AWS candidates, saving us plenty of time and resources. They work well and have provided an excellent service."