Data Engineer/Architect

C2FO
Leawood, KS Full-time
Posted on April 28, 2019

Data is the lifeblood of any technology company. At C2FO, we know the best way to ensure our success is to provide our decision-makers with the most accurate, most relevant data possible. Our Data Engineers use cutting-edge, open-source technologies to collect, process, and store our data. We use the flexibility of cloud-hosted computing to prototype and implement the latest and greatest data tools.   

The Data Engineer / Architect is responsible for nurturing an architecture that improves the quality, simplicity, and time to market of our data services to the enterprise. Working collaboratively with Engineering, Product, and Data & Machine Learning Sciences, this role owns the creation of the architectural vision and supports the team members executing against it. You will also work closely with some of the most talented scientists, engineers, and data junkies in the Midwest to solve the most complicated long-term financial and working capital problems across multiple industry verticals (Retail, Technology, Health Care).

Responsibilities

  • Work closely with Software Engineers, Data Scientists, and Business Analysts to meet data storage, access, and analysis needs
  • Collaborate with DevOps to deploy and maintain clustered computing across multi-cloud environments
  • Monitor, maintain, and enhance the existing data structures to ensure reliability and data integrity
  • Design and build large-scale, automated ingestion pipelines that integrate across multi-cloud platforms and disparate data sources
  • Evaluate cutting-edge technologies for integration into existing technology stack
  • Build cloud-native applications leveraging AWS services
  • Leverage continuous delivery tools
  • Apply a services-first approach to building applications
  • Use event- or messaging-based architectures
  • Deliver multi-tenant applications
  • Leverage zero-downtime deployment patterns

Requirements

  • 3+ years of experience working with large data sets
  • Adept at data analysis and pattern identification
  • Advanced proficiency with Big Data streaming and processing technologies (preferably Apache Spark)
  • Experience using technologies in the Hadoop ecosystem as well as Linux/Unix operating systems
  • Proficiency with a JVM language (preferably Scala) or experience coding within a functional programming paradigm
  • Experience with cloud-hosted computing (preferably AWS or GCP)
  • Experience using a version control system (preferably Git)
  • Curiosity and a drive to solve complex problems (where the answers can't always be found on Stack Overflow)
  • SQL and data modeling skills and experience
  • Strong soft skills (communication) and the ability to work well in a cross-functional environment
  • Strong analytical, troubleshooting and problem-solving skills
  • Familiarity with distributed computing and parallel processing frameworks
  • Familiarity with data science & machine learning concepts/methodologies

 

Please apply at https://c2fo.com/company/careers/