The PNC Financial Services Group, Inc. seeks a Data Engineer in Overland Park, KS, with the option to telecommute using appropriate telecommuting systems for up to three days per week and a minimum of two days per week in the office. Within the Data Management Business Intelligence Development department, the Data Engineer performs analytical tasks on vast amounts of structured and unstructured data to extract actionable business insights. Specific duties include: (i) lead the development, support, and implementation of data solutions for multiple applications to meet business objectives and user requirements; (ii) leverage technical knowledge and industry experience to design, develop, and maintain technology solutions; (iii) lead the development of software applications for data requirement analysis and of data preparation processes for targeted data solutions; (iv) lead the development of software applications for monitoring big data analytics infrastructure, educate the business community, and encourage the use of best practices; and (v) implement data governance and audit policies for big data ad hoc environments.
Bachelor’s degree in Electronics Engineering, Electrical Engineering, Computer Information Systems, or Information Systems Management, plus 3 years of experience creating technical requirements for banking payment systems using JIRA, Confluence, and Azure DevOps (ADO), is required. Must have experience with: (i) extracting data from SQL Server, Teradata, and Oracle databases and transforming and loading it into a central repository using the ETL tools Hive, Impala, Python, Spark, SSIS, and Informatica; (ii) implementing a wide range of operations, including data processing, access, storage, governance, security, and operations, using Hive, Impala, Cloudera Manager, and Cloudera Navigator; (iii) identifying ways to improve data reliability, efficiency, and performance using Unravel; (iv) importing and exporting data with Sqoop between HDFS and relational database systems, including SQL Server, Oracle, and Teradata; (v) developing data pipelines using Flume to extract data from weblogs and store it in HDFS, and writing shell scripts to monitor the health of Hadoop daemon services; (vi) optimizing existing algorithms in Hadoop using Spark context, Spark SQL, DataFrames, and RDDs; (vii) developing workflows using Oozie to automate the loading of data into HDFS and its preprocessing with PySpark; and (viii) data analysis, data mapping techniques, and creating data visualizations for end users using OBIEE and Tableau.
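For readers unfamiliar with the extract-transform-load pattern named in requirement (i), the sketch below illustrates the idea in miniature. It is a hypothetical, minimal example only: Python's built-in sqlite3 module stands in for the SQL Server/Teradata/Oracle sources and the central-repository target, and all table and column names (payments, repo_payments, etc.) are invented for illustration, not taken from any PNC system.

```python
import sqlite3

def extract(conn):
    """Extract: pull raw rows from the source system."""
    return conn.execute(
        "SELECT txn_id, amount_cents, currency FROM payments"
    ).fetchall()

def transform(rows):
    """Transform: convert cent amounts to dollars, normalize currency codes."""
    return [(txn_id, amount_cents / 100.0, currency.upper())
            for txn_id, amount_cents, currency in rows]

def load(conn, rows):
    """Load: write the transformed rows into the central repository table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS repo_payments"
        " (txn_id TEXT, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO repo_payments VALUES (?, ?, ?)", rows)

# Demo with in-memory databases standing in for source and target systems.
source = sqlite3.connect(":memory:")
source.execute(
    "CREATE TABLE payments (txn_id TEXT, amount_cents INTEGER, currency TEXT)"
)
source.executemany("INSERT INTO payments VALUES (?, ?, ?)",
                   [("t1", 1250, "usd"), ("t2", 990, "usd")])

target = sqlite3.connect(":memory:")
load(target, transform(extract(source)))
loaded = target.execute(
    "SELECT txn_id, amount, currency FROM repo_payments ORDER BY txn_id"
).fetchall()
print(loaded)  # [('t1', 12.5, 'USD'), ('t2', 9.9, 'USD')]
```

In production the same three stages would typically be distributed across the tools the posting lists (Sqoop or Informatica for extraction, Spark or Hive for transformation, HDFS or a warehouse for loading), but the extract/transform/load structure is the same.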
40 hours/week, 8:00 a.m.–5:00 p.m. Interested individuals should apply online at http://www.pnc.com using keyword R136348. PNC provides equal employment opportunity to qualified persons regardless of race, color, sex, religion, national origin, age, sexual orientation, gender identity, disability, veteran status, or other categories protected by law.