Roxana

Specialist
Since 2013, Janani has worked on a variety of integrations, processing data from heterogeneous sources through Alteryx Designer to extract, transform, and generate reports in output formats such as TDE (Tableau Data Extract), Tableau dashboards, and Salesforce objects to Veeva Next. She is experienced in CDL processing with Databricks, the Databricks Workspace user interface, managing Databricks notebooks, and data-retention analysis of iDNA data through Airflow and GitLab data files.
CANDIDATE ID
AG010
YEARS OF EXPERIENCE
8+ years
EXPERT IN
Statistical Analysis
Oracle
SQL
Jenkins
Data Extraction
Data Mapping
ETL
Data Cleansing
Unix
AWS Redshift
Informatica
Alteryx
XML
Autosys

Skills

Libraries/APIs

Web services – SoapUI, DIH Console, IICS

Other

Statistical Analysis, Databricks, Statistics, Data Visualization, Unix, Jenkins, XML, Airflow, Salesforce, Jira, Subversion, Bitbucket, GitLab

Languages

Unix, SQL, JavaScript, Databricks

Paradigms

Data Science

Platforms

Amazon Web Services (AWS), Informatica, Alteryx

Storage

MySQL, Oracle 11g PL/SQL, Redshift

Professional experience

Technical Lead

2020 – Current

Project 1

Project Title: NBC Digital

Client: AMGEN

Project Type: Web application

Tools: Alteryx Designer, Redshift, Databricks

Period: From Nov 2020

Role and Responsibilities

  • Led the team (JAPAC region) on integration-level activities based on the business requirements.
  • Built dynamic code in Alteryx Designer to be executed quarterly for the country-wise sales data.
  • Assisted and led the ongoing development of technical best practices for data movement, data quality, data cleansing, and other code-related activities in Alteryx.
  • Handled the project individually, holding close conversations with clients on requirements, demos, and extracts.
  • Standardized the code to fetch inputs dynamically in Alteryx based on the dashboard requirements.
  • Decommissioned the tables in BYOD.
  • Involved in multiple project-related presentations on the requirement specifications.
  • Created and incorporated Delta tables in the different CDL layers (Landing, Unified, Stitched, Unstitched, and Stitched Athena) in Databricks.
  • Processed files from S3 into the CDL-layer tables through Airflow DAGs (ad hoc and existing) and validated the changes against the GitLab code for the current execution in Databricks notebooks (see the sketch after this list).
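
A minimal sketch of what one such S3-to-CDL landing load could look like inside a Databricks (PySpark) notebook; the bucket, path, and table names below are hypothetical placeholders, not the project's actual objects.

```python
# Minimal sketch: land raw S3 files into a CDL landing-layer Delta table.
# SOURCE_PATH and TARGET_TABLE are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

SOURCE_PATH = "s3://example-bucket/incoming/sales/"   # hypothetical S3 drop location
TARGET_TABLE = "cdl_landing.sales_raw"                # hypothetical landing-layer table

# Read the newly dropped files (CSV assumed here) and stamp the load time.
raw_df = (
    spark.read
    .option("header", "true")
    .csv(SOURCE_PATH)
    .withColumn("ingest_ts", F.current_timestamp())
)

# Append into the Delta table that the Unified/Stitched layers consume downstream.
raw_df.write.format("delta").mode("append").saveAsTable(TARGET_TABLE)
```

In practice an Airflow DAG would trigger a notebook like this per run, with the GitLab-tracked code serving as the reference against which each execution is validated.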

Technologies: Alteryx Designer/Server, Redshift, Databricks

Senior Software Engineer

HCL Technologies

2018 – 2020

Project 2

Client: Merck & Co., Thryve Digital Health LLP

Tools: Informatica, SoapUI, Oracle, Unix

Duration: Oct 2018 to Oct 2019

Role: Senior Software Engineer

Period: June 2018 – Nov 2020  

Role and Responsibilities

  • Understood the requirements through the integration data mapping.
  • Prepared the entity-relationship model for the relational tables.
  • Understood the business logic of the existing system and created the end-to-end mapping flow in Informatica.
  • Worked on encryption and decryption at the source and target systems based on the public key.
  • Worked in Data Integration Hub (Informatica).
  • Created topics on the relational tables/files.
  • Built publications (automated workflows and custom workflows).
  • Generated subscriptions based on the dependent jobs and invoked them through Informatica Data Integration Hub.
  • Scheduled the jobs through the DIH Console based on delta/full load.
  • Worked on web services to process data in the Web Services Consumer transformation through SOAP request/response.
  • Worked on REST APIs to post the XML/source data through curl scripting (see the sketch after this list).
  • Handled a POC in IICS for web connectors/service connectors using ICAI.
  • Worked on reusable mappings and sessions, session configuration, and all levels of mappings using complex transformations in Informatica PowerCenter.
  • Worked in depth on creating, developing, and scheduling integrations (publications/subscriptions) through Informatica Data Integration Hub.
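
A minimal sketch of the kind of REST post that the curl scripting above automates, written here in Python for readability; the endpoint URL, token, and file name are hypothetical placeholders.

```python
# Minimal sketch: post an XML payload to a REST endpoint, analogous to the
# project's curl scripting. URL, token, and file name are hypothetical.
import requests

API_URL = "https://api.example.com/v1/documents"   # hypothetical endpoint
TOKEN = "..."                                      # credential issued by the target system

with open("source_payload.xml", "rb") as f:        # hypothetical source file
    payload = f.read()

response = requests.post(
    API_URL,
    data=payload,
    headers={
        "Content-Type": "application/xml",
        "Authorization": f"Bearer {TOKEN}",
    },
    timeout=30,
)
response.raise_for_status()                        # fail loudly if the post is rejected
print(response.status_code)
```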

Technologies: Informatica PowerCenter, Informatica Data Integration Hub, SoapUI, Oracle, IICS, Unix

Associate

Cognizant

2013 – 2018

Project 3

Title: Greenfield Data Lake

Project Type: Web application

Client: Blue Shield of California, Toyota Motor Sales

Role: Associate

Period: Nov 2013 – May 2018  

Roles and Responsibilities:

  • From the initial framing of the approach/design document, worked on all development activities across the SCD types and prepared UTC/UTL for integration-level testing (an illustrative SCD Type 2 sketch follows this list).
  • Documented technical specifications for all ETL processes, performed unit tests on them, and prepared the required programs and scripts.
  • Assisted in the ongoing development of technical best practices for data movement, data quality, data cleansing, and other ETL-related activities.
  • Participated in and contributed to quality-assurance walkthroughs of ETL components.
  • Worked closely with the Project Manager to develop and update the task plan for ETL work and to keep the manager aware of any critical task issues and dependencies on other teams.
  • Ensured that the delivered ETL code ran and conformed to specifications and design guidelines, unit-tested ETL code so it could be delivered and run in a system-testing environment, and reviewed ETL design documents in close collaboration with the review team.
  • Monitored all business requirements, validated all designs, scheduled all ETL processes, and prepared documents for all data-flow diagrams.
  • Reviewed and analyzed functional requirements and mapping documents; handled problem solving and troubleshooting.
  • Performed unit testing at various levels of the ETL and was actively involved in team code reviews.
  • Worked on optimizing the mappings by creating reusable transformations and mapplets; handled debugging and performance tuning of sources, targets, mappings, transformations, and sessions.
  • Worked extensively with connected and unconnected Lookup transformations using dynamic cache.
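
For context on the SCD work mentioned above, here is an illustrative Slowly Changing Dimension Type 2 sketch, expressed in PySpark purely to spell out the pattern; the project itself implemented it with Informatica PowerCenter mappings, and the table and column names (dim_customer, stg_customer, address) are hypothetical.

```python
# Illustrative SCD Type 2 sketch; table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

dim = spark.table("dim_customer")   # customer_id, address, valid_from, valid_to, is_current
stg = spark.table("stg_customer")   # today's extract: customer_id, address
today = F.current_date()

# Keys whose tracked attribute changed against the current dimension row.
changed = (
    dim.where("is_current = 1")
       .join(stg.withColumnRenamed("address", "new_address"), "customer_id")
       .where(F.col("address") != F.col("new_address"))
       .select("customer_id")
)

# Type 2: expire the old version of each changed row...
expired = (
    dim.where("is_current = 1")
       .join(changed, "customer_id", "leftsemi")
       .withColumn("valid_to", today)
       .withColumn("is_current", F.lit(0))
)

# ...and insert a fresh current version (handling of brand-new keys omitted for brevity).
new_rows = (
    stg.join(changed, "customer_id", "leftsemi")
       .withColumn("valid_from", today)
       .withColumn("valid_to", F.lit(None).cast("date"))
       .withColumn("is_current", F.lit(1))
)

# The rewritten dimension is the untouched history plus `expired` plus `new_rows`.
```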

Technologies: Informatica PowerCenter, Oracle SQL, Unix

Certifications

  • Received the Developer Essentials and Developer Foundation badges – Databricks Partner Training.
  • Completed and received certifications in Big Data - Hadoop, MapReduce, Pig, Spark, Fundamentals of Python, Data Science, Business Analytics & Artificial Intelligence (theoretical training) from Amity University.
  • Published a white paper on the usage of the XML and Java transformations in Informatica.
  • Participated in and suggested ideas for the Idea Innovation Campaign (Cognizant).

Education
