Description: Data Analyst to develop Business Intelligence applications using Extract, Transform, Load (ETL) processes that enable business users to perform business-driven analysis. Analyze and test pipelines that ingest data from different sources, with cleansing, transformation, and performance tuning on Hadoop and Teradata platforms to fulfill high-volume business intelligence reporting requirements. Code and develop strategies that improve the existing process of Business Intelligence application development and testing, saving manual effort and time; this includes Python and shell scripting and web interface development. Test Hadoop and Teradata applications built using MapReduce, Pig programming, and BTEQ scripts, validating them with HiveQL and SQL. Transfer data from relational databases onto the HDFS cluster. Extract data from AWS Cloud S3 buckets and load it onto the Teradata server. Test and schedule Control-M jobs to run at specific times per business needs, adding interdependencies between jobs and alerting/notifying as the business requires. Encrypt sensitive data using Java functions and HP ProtectTools to secure customer data per the guidelines from the Information Security team. Analyze data using SQL to implement a Data Warehouse application on Teradata with data from RDBMS and Hadoop sources. Develop and validate the Teradata views and reports required for the data insights used by business users. Filter and merge data from various Hive tables and store the results on HDFS. Provide data to visualization tools such as Tableau, QlikView, and SAP BusinessObjects. Prepare test plans and test cases; perform end-to-end testing and test review meetings on data flow and validation of its transformations; and prepare exit reports for IT/business team approvals. Manage day-to-day production support issues, ensuring effective and timely resolution.
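The ingest-cleanse-transform-load flow described above can be illustrated with a minimal Python sketch. All names here (`extract`, `cleanse`, `transform`, `run_pipeline`, the `customer_id` field) are hypothetical and stand in for whatever the actual pipeline stages would be; this is an assumption-laden outline, not any specific system's implementation.

```python
# Minimal sketch of an extract-cleanse-transform-load pipeline.
# Stage names and record fields are illustrative assumptions only.

def extract(rows):
    """Simulate ingesting raw records from a source feed."""
    return list(rows)

def cleanse(record):
    """Trim whitespace and reject records missing a customer id."""
    cleaned = {k: v.strip() if isinstance(v, str) else v
               for k, v in record.items()}
    return cleaned if cleaned.get("customer_id") else None

def transform(record):
    """Derive a reporting field, as a transformation step might."""
    record["name_upper"] = record["name"].upper()
    return record

def run_pipeline(raw_rows):
    """Run each record through cleanse and transform, keeping valid ones."""
    warehouse = []
    for rec in extract(raw_rows):
        rec = cleanse(rec)
        if rec is None:
            continue  # drop the record, as a cleansing rule would
        warehouse.append(transform(rec))
    return warehouse

raw = [
    {"customer_id": "1001", "name": "  alice "},
    {"customer_id": "", "name": "bob"},  # rejected: missing id
]
result = run_pipeline(raw)
```

In a real deployment the extract and load stages would read from the sources named in the description (S3, RDBMS) and write to HDFS or Teradata; the sketch only shows the shape of the control flow.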
Duties include tracking and resolving data issues, and designing, testing, and documenting modifications and functional enhancements, using Hadoop, Hive, Scala, Apache Spark, HBase, shell scripting, DevOps, Teradata, SQL, PL/SQL, SAP BO, Tableau, QlikView, QlikSense, IBM Cognos, MDX, MDS, DQS, ITIL, SAP ECC, R/3, Data Services, DataStage, SQL Server, SSRS, SSAS, MS Power BI, and PostgreSQL.
Required: Bachelor’s or higher degree in Computer Science, Computer Information Systems, Software Engineering, Information Technology, Computer Applications, or a related field, or its equivalent, together with the competencies necessary for this field of work.