Job Description
Responsibilities:
Work as a senior developer for the Hadoop enterprise data platform
Develop components for big data platforms related to data ingestion, storage, transformations and analytics
Test data components and ensure they meet requirements
Ensure that code units and designs follow reference architecture standards
Incorporate new and existing technology platforms for their defined use cases
Collaborate with other developers and architects to enhance the capabilities of the platforms
Assist in cross-team collaboration and design discussions
Provide sandbox environments that allow dataset users to build models for analysis
Skills:
10+ years of overall experience in Information Technology and Systems
4+ years of experience with big data distribution and analytics, specifically in the Cloudera ecosystem, including hands-on, full project life-cycle implementation with the following features:
Data volumes in terabytes with vertical and horizontal scalability
Structured and unstructured/semi-structured data
High speed querying
Analytics use cases including in-memory processing
Highly secure system
Metadata management
4+ years of experience in relational and dimensional data modeling
7+ years of experience in multiple relational database systems (SQL Server, Oracle, DB2)
Experience with high-volume systems
5+ years of Java programming experience
4+ years of scripting experience (Scala/Python)
4+ years of Hive programming experience
3+ years of Avro/Parquet experience
4+ years of Impala experience
Experience with the Cloudera distributed system
Experience with financial-industry projects
Experience with high-volume data migration projects