Diono

Data Engineer
Since 2010, Surendar has been working professionally in the fields he loves: software and Big Data applications, using technologies such as Cloudera Distribution, Databricks, Spark (Scala), and PySpark. He has good exposure to AWS Cloud and to Airflow as a workflow management tool, along with programming languages such as Java, Python, and SQL. Before that, he studied Electrical and Electronics Engineering at Velammal Engineering College. All in all, Surendar is an engaging, intense communicator with a passion for knowledge and understanding.
CANDIDATE ID
AG007
YEARS OF EXPERIENCE
11+ years
EXPERT IN
Databricks
Spark (Scala)
PySpark
Java
Python
SQL
Spark
Oracle
MySQL
PL/SQL
Cloudera Hadoop
MSSQL

Skills

Libraries/APIs

Pandas, SQL, Matplotlib, Databricks, Spark, Scala

 

Other

Statistical Analysis, Machine Learning, Statistics, Physics, Modeling, Full-stack, Software Development, Data Visualization, Data Engineering, Visualization, AWS, ECS, Optimization, Computer Vision

 

Languages

Python, SQL, JavaScript

 

Platforms

Amazon Web Services (AWS), Linux, Windows

 

Frameworks

Angular, Symfony, Laravel, Redux, Bootstrap, JSON Web Tokens (JWT), PHPUnit, YARN, Swagger, Express.js, Serverless Framework, React Native, AngularJS, Protractor

 

Storage

MySQL, MongoDB, Elasticsearch, Amazon S3 (AWS S3), Databricks, Cloudera Hadoop, Spark, Kafka, Talend, OWB

Industry Expertise

Information Technology

Professional experience

Data Engineer Lead

Agilisium

2020 – Present

Project 1

Project Title: NBC Digital

Client: Amgen

Role: Software Engineer

Tools: Databricks, PySpark, Python, Airflow

Period: February 2020 – Present

AWS Services: S3, Redshift, Athena

Technologies: Amazon Web Services (AWS), Databricks, PySpark, Airflow

Role and Responsibilities

  • Upgrade the existing data pipeline to the new framework (parameterized jobs).
  • Design Airflow DAGs and integrate them with the master DAG, or schedule them to trigger individually.
  • Write Python scripts to automate the upgrade and table-creation steps.
  • Create Python jobs to generate weekly reports and schedule them in Databricks.
  • Design, code (Python/PySpark), and test new jobs/reports based on business requirements.
  • Handle general Databricks administration tasks (adding users, granting cluster access, creating jobs, etc.).
  • Automate admin jobs that give insights into the usage of Databricks resources.
  • Identify non-compliant Databricks resources (clusters/jobs) and enable their owners to take corrective action.
  • Create generic scripts for Databricks migration (clusters, jobs, workspace, users, etc.).

Software Engineer

Kanini Software Solutions (formerly GISbiz)

2019 – 2020

Project 2

Client: Cengage

Project Type: Web application

Tools: Spark (Scala), Hive, Kafka

Role: Software Engineer

Duration: From June 2019

Role and Responsibilities

  • Processed a multiline record file (a single record spanning two lines) using Spark and pushed the output to HDFS; created a Hive table on top of it for analysis, and built a Spark job to generate reports from the data. Implemented the file-processing job with multiple threads to read files in parallel.
  • Created a dynamic Spark job to load files from various financial data providers into the corresponding tables. The job takes a provider and a date as parameters, dynamically selects the required fields, and loads the data into the desired target table.
  • Captured real-time images from a webcam and streamed them to Spark using Kafka: the images are captured and pushed to Kafka, while Spark consumes the stream and processes it using an image-recognition module designed by another team.

Developer

Western Union TEC (formerly Opus Software Technologies)                      

2017 – 2019

Project 3

Project Name: WU-CAP

Tools: Sqoop, Hive, Impala, Talend, Spark, Kafka

Solution Environment: UNIX, DB2, Hadoop 2.5.0 (CDH 5.3.3)

Role: Software Engineer

Role and Responsibilities

  • Understood the source system.
  • Performed requirement and data analysis based on business needs and prepared the source-to-target mapping document.
  • Developed Sqoop scripts to extract data from the source system.
  • Developed Hive scripts for the ETL.
  • Designed and developed metadata-driven jobs for Sqoop and target loading in Talend.
  • Developed various reports from Hive and exposed them to the business team.

Developer

Opus Software Solutions

2016 – 2017

Project 4

Project Name: Linq3

Client: Linq3

Project Type: Web application

Tools: Sqoop, Hive, Impala, Talend, Spark, Kafka

Solution Environment: UNIX, DB2, Hadoop 2.5.0 (CDH 5.3.3)

Role: Software Engineer

Duration: May 2016 – July 2017

Role and Responsibilities

  • Understood the source system.
  • Performed requirement and data analysis based on business needs and prepared the source-to-target mapping document.
  • Developed Sqoop scripts to extract data from the source system.
  • Developed Hive scripts for the ETL.
  • Designed and developed metadata-driven jobs for Sqoop and target loading in Talend.
  • Developed various reports from Hive and exposed them to the business team.

Developer

Opus Software Solutions

2014 – 2016

Project 5

Project Name: UAE Exchange – MasterCard

Client: MasterCard

Project Type: Web application

Tools: Eclipse 4.4.2 (Luna), TortoiseSVN

Technologies: Core Java, PL/SQL

Solution Environment: UNIX, DB2, Hadoop 2.5.0 (CDH 5.3.3)

Role: Software Engineer

Duration: May 2014 – April 2016

Role and Responsibilities

  • Designed and developed code/SPs for new requirements.
  • Developed and modified iReports based on requirements.
  • Analyzed and developed DB migration scripts.
  • Fixed defects and performed testing.
  • Handled release management.

Developer

Opus Software Solutions

2014

Project 6

Project Name: Team CPM

Client: MasterCard

Project Type: Web application

Tools: Eclipse 4.4.2 (Luna), TortoiseSVN

Technologies: Core Java, PL/SQL

Environment: Oracle, MS SQL, Windows, UNIX

Role: Software Engineer

Duration: January 2014 – May 2014

Role and Responsibilities

  • Designed and developed SPs for new requirements.
  • Gathered functional requirements for OS Ticketing.
  • Tuned SQL queries.
  • Monitored and reported status.

Developer

Opus Software Solutions

2013 – 2014

Project 7

Project Name: TSYS DAX Integration

Client: NetSpend Corporation

Project Type: Web application

Tools: OWB, SQL Developer, TortoiseSVN

Technologies: Core Java, PL/SQL

Environment: Oracle, MS SQL, Windows, UNIX

Role: Developer

Duration: November 2013 – January 2014

Role and Responsibilities

  • Understood the requirements through the LLD.
  • Developed mappings based on the business logic specified in the LLD.
  • Performed unit testing of the developed mappings.

Developer

Tata Consultancy Services

2010 – 2013

Project 8

Project Name: NSDL

Client: National Securities Depository Limited

Project Type: Web application

Tools: OWB, OEM, MS VSS, Subversion (SVN)

Technologies: Core Java, PL/SQL

Environment: Oracle 10g R2, Oracle 11g R2, Windows, UNIX

Role: Developer

Duration: March 2010 – November 2013

Role and Responsibilities

  • Designed, tested, and deployed OWB mappings.
  • Designed and developed code/SPs for new requirements and CRs.
  • Performed unit testing of ETL maps, SPs, developed code, shell scripts, and jobs.
  • Monitored performance and tuned the application.
  • Fixed bugs and enhanced existing functionality.
  • Handled release management, status monitoring, and reporting.


Education

2004 – 2009

Bachelor of Engineering in Electrical and Electronics Engineering

Customer testimonials

"Top-notch candidates; helped me save a lot of time by providing the most qualified resources. They also simplify payroll, find solutions that work for you, and come at a really competitive price point."
"CoCreator perfectly understood the role we required and helped us find the perfect AWS candidates, saving us plenty of time and resources. They work well and have provided an excellent service."