TITLE: Senior Application Developer
LOCATION: Maryland Heights, MO
JOB SUMMARY:
- Design, develop, and implement real-time data ingestion using technologies including Kafka, Apache NiFi, and Hadoop batch-layer tools (Hive, HBase), as well as ETL development and integration systems.
- Design and develop batch and real-time Big Data systems using technologies including SQL, Talend, and Hadoop batch-layer tools (Spark SQL, Python, Hive, Impala), as well as ETL development and integration systems.
- Design platform architecture, including platform utilities and patterns for deploying new services and new applications.
- Provide guidance to the operations group on the impact of architectural changes on daily data flow and operational effort.
- Define the reference architecture to achieve central operations capabilities, advanced analytics, and Smiths Central Administration.
- Resolve issues regarding development, operations, implementations, and system status.
- Build, maintain, test, and evaluate complex solutions in close collaboration with other engineers and analytic partners.
- Translate requirements into new Big Data solutions; maintain and execute existing processes; drive continuous improvement.
- Define and architect platform portal core services.
- Design and architect the analytics module, starting with data ingestion and storage.
- Design interactive visualization tools for data and system analysis.
- Design and implement reporting dashboards that show key business metrics.
- Identify actionable insights, suggest recommendations, and influence the direction of the business by effectively communicating results to cross-functional groups.
- Implement ETL processes to maintain, improve, clean, and manipulate data.
- Profile data to measure quality, integrity, accuracy, and completeness.
- Develop and implement tools, scripts, queries, and applications for ETL/ELT and data operations.
A hybrid (in-office and remote work) arrangement is available.
EDUCATION/REQUIREMENTS: Bachelor's degree, or foreign equivalent, in Engineering (any), Computer Science, Computer Applications, or a related field, and 5 years of experience handling large-scale software development and integration projects using the SDLC. Must have 4 years of experience utilizing programming languages and tools including Java, Scala, Python, Pig, Hive, SQL, ETL development, or Talend, and utilizing Hadoop ecosystem tools including Scala, Spark, Hive, or Kafka. Must have 3 years of experience with data modeling for OLTP or OLAP systems, and with designing and developing transactional or analytical reports.