Bryce

Principal Data Solution Architect
16+ years of experience as a Big Data Engineer, Data Engineer, and Data Analyst, including designing, developing, and implementing data models for enterprise-level applications and systems. Experienced in developing Oracle PL/SQL scripts, stored procedures, and triggers for business logic implementation. Experienced in data transformation, data mapping from source to target database schemas, and data cleansing procedures. Performs extensive data profiling and analysis to detect and correct inaccurate data in databases and to track data quality. Experienced in performance tuning and query optimization techniques in transactional and data warehouse environments. Experienced in technical consulting and end-to-end delivery covering architecture, data modeling, data governance, and solution design, development, and implementation. Experienced in using distributed computing architectures such as AWS products.
CANDIDATE ID
AG002
YEARS OF EXPERIENCE
16+ years
EXPERT IN
SQL
SQL Server
SQL queries
Oracle query optimization
OLTP
OLAP
Oracle
Teradata
Kinesis
Redshift
Snowflake
Databricks
Snaplogic
SSIS
EC2
EMR
Elasticsearch
Hadoop
Python
Spark
MapReduce

Skills

Operating System   

Windows 98/NT/2000/XP, Ubuntu.

Big Data Ecosystems 

Hadoop, MapReduce, Hive, PySpark, Apache Kafka, Airflow

Languages 

Python, Spark, PL/SQL, SQL, Shell scripting, and Perl

Cloud Platforms 

Amazon Web Services (AWS): EMR, Glue, EC2, S3, Redshift, Kinesis, and Snowflake

RDBMS 

Oracle, SQL Server and MySQL.

Tools & Utilities

Toad, SQL Developer, Aginity

ETL/DW Tools 

SnapLogic, Databricks, Tableau

Version Controls 

Git

Professional experience

Agilisium Data

Dec 2019 – Present

Project 1

Project Title: NBC Digital

Client: NBC

Environment: Spark, Snowflake, Redshift, S3, EMR, Databricks, Airflow

Role and Responsibilities

  • Point of contact for all automation as part of data science (recommendation engine).
  • Analyzing raw data in S3 and extracting the relevant fields to Redshift/Delta tables for exploration.
  • Gathering requirements, providing POCs, exploring new tools, and creating templates and models.
  • Building recommendation-model POCs with SageMaker and Amazon Personalize.
  • Sending work status reports to the Project Manager.

Project 2

Project Title: UMG Digital Sales Reports

Client: UMG, US

Duration: Oct 2018 to Oct 2019

Environment: Redshift, Oracle, SnapLogic, Python, PL/SQL, Shell script, and BI

Role and Responsibilities

  • Designed the initial architecture for the project.
  • Performed data modelling for OLAP applications with appropriate documentation.
  • Worked on SnapLogic transformation rules for data migration from OLTP to the warehouse.
  • Developed an operational data store to design data marts and enterprise data warehouses.
  • Designed ETL data flows that extract, transform, and load data, optimizing SnapLogic pipelines by enhancing the Script snap.
  • Developed the required data warehouse model using a star schema for the generalized model.
  • Responsible for data mappings, data validation, and deployment of applications into the AWS environment.

Project 3

Title: Greenfield Data Lake

Client: Pluto, US

Duration: Apr 2018 to Sep 2018

Environment: Kinesis, Redshift, S3, Glue, EC2, Snowplow, Lambda, Athena, Looker

Roles and Responsibilities:

  • Designed the architecture using AWS services.
  • Delivered high-quality applications from inception to completion using Python, streaming, AWS, Redshift, and S3.
  • Built and deployed Python code in Lambda for data transformation in Kinesis.
  • Designed the DB model.
  • Wrote scripts to load data from unstructured data sources to Redshift.
  • Created Glue jobs to load data from flat files to Redshift.
  • Created Athena tables to read data from raw files.
  • Supported the reporting team in building a real-time dashboard.
  • Extended table columns for custom fields as per business requirements.
  • Provided solutions to improve query performance in Redshift.

Project 4

Title: Cloud BI

Client: Corsair, US

Duration: Jun 2017 to Mar 2018

Environment: Redshift, SnapLogic, Tableau

Roles and Responsibilities:

  • Reviewed the existing architecture and built the new physical architecture as per business requirements.
  • Refactored the existing OLTP data model to fit OLAP.
  • Built aggregation tables for faster reporting.
  • Supported the ETL team in building the ETL platform.
  • Worked with the reporting team to build the dashboard in Tableau.

HPE

Jan 2011 to May 2017

Project 5

Title: GIMM/IMM

Client: UMG, US

Duration: Jan 2011 to May 2017

Environment: SQL Server, MSTR, Informatica

Roles and Responsibilities:

  • Worked in this organization as both a Development DBA and a Production DBA.
  • Worked on performance-related issues (query tuning; disk, memory, and network issues).
  • Extensive experience with indexing and fragmentation-related activities.
  • Maintained partitioned tables for historical data in a data warehousing environment.
  • Monitored the snapshot replication process daily; the replication job is invoked by a trigger once the load is completed by the ETL team.
  • Performed weekly and monthly partition maintenance for the partitioned tables in the IMM and GIMM environments.
  • Handled full backups for a large database of about 25 TB.
  • Created and managed users, roles, and groups, and handled database security policies.
  • Set backup strategies based on business criticality and was responsible for restore strategies.
  • Performed SQL Server health checks: troubleshooting jobs, checking error logs, taking backups, and monitoring log spaces.
  • Experienced in side-by-side and in-place database migrations.
  • Responsible for space allocation and refreshing databases as per client requirements.
  • Rebuilt and monitored indexes at regular intervals for better performance.
  • Exported and imported data from other data sources such as flat files using SQL Import/Export and SSIS.

IGate Global Solution 

Jul 2009 to Jan 2011

Project 6

Title: Dashboard Application

Client: Royal Bank of Canada, Canada

Duration: Jul 2009 to Jan 2011

Environment: SQL Server, MSTR, Informatica

Roles and Responsibilities

  • Involved in development and UAT of the interface scripts.
  • Involved in project enhancements and maintenance.
  • Debugged the code and fixed bugs.

IGate Global Solution

Dec 2007 to May 2009

Project 7 

Title: ModelN – CRM Application

Client: Avago Technology, Malaysia

Duration: Dec 2007 to May 2009

Environment: SQL Server, MSTR, Informatica

Roles and Responsibilities

  • Involved in development and UAT of the interface scripts.
  • Involved in project enhancements and maintenance.
  • Debugged the code and fixed bugs.

IGate Global Solution

Jan 2006 to Dec 2007

Project 8

Title: Report Server – Banking Operations

Client: ING Vysya Bank Ltd., India

Duration: Jan 2006 to Dec 2007

Environment: SQL Server, MSTR, Informatica

Module Name: Cash Transaction Report

Roles and Responsibilities:

  • Involved in development and UAT of the report and the extraction procedure.
  • Wrote scripts to extract data from profile on a daily/monthly basis and load it to the report server.
  • Involved in project enhancements and maintenance; worked onsite for production support.
  • Debugged the code and fixed bugs.

Module Name: SLMG – Non-Performing Assets Utility

Roles and Responsibilities

  • Involved in development and UAT of the report and the extraction procedure.
  • Wrote scripts to extract data from profile on a monthly basis and load it into the SLMG module.
  • Involved in project enhancements and maintenance; worked onsite for production support.
  • Debugged the code and fixed bugs.

Certifications

Education

2005

B.Sc Electronics  

Bharathiar University

Customer testimonials

"Top notch candidates, helped me save a lot of time by providing the most qualified resources. They also simplify payroll, find solutions that work for you, and come at a really competitive price point"
"CoCreator perfectly understood the role we required and helped us find us the perfect AWS candidates, saving us plenty of time and resources. They work well and have provided an excellent service."