Responsible for the design and administration of data center and cloud solutions for disaster recovery; implemented and maintained physical standby databases for high availability.
Responsible for architecture and data management for migration projects; also involved in the blueprint phase.
Developed mappings in Informatica PowerCenter to load data from various source systems into the target Oracle database.
Defined and implemented the ETL strategy for the data warehouse using Informatica PowerExchange, Oracle, MS SQL Server, and Talend.
Big Data/Hadoop Developer
Eastwood High School
Designed and implemented a POC to migrate data from a legacy system to the Hadoop infrastructure. Created MapReduce jobs to load data into HBase tables and used Cassandra to store data in the cloud.
Designed and implemented a Big Data analytics platform using NoSQL, HDFS, HBase, Sqoop, Kafka, and Storm.
Responsible for data integration and data exploration using open-source Talend, Apache Spark, and Cassandra to perform analytics on the Big Data platform.
Designed and developed a data warehouse using MongoDB and Cassandra to provide a service to the IT and MDM teams. Worked with the architect to develop a proof of concept for the analytics solution.
Experienced in data visualization and data integration using the Hadoop ecosystem, including working with the EDW team to understand business requirements.
Implemented Spark for data analytics and integration with Teradata. Created HBase tables to load large sets of structured, semi-structured, and unstructured data coming from UNIX, NoSQL, and a variety of portfolios.