Roles and Responsibilities:
The person will be primarily responsible for:
- Creating and maintaining data streams and processes within the Hadoop and EDW platforms
- Profiling and analyzing various data sources
- Designing and implementing the data ingestion strategy
- Designing, developing, coding, testing, and debugging complex new code packages, or making significant enhancements to existing code/packages
- Interacting with business users to understand and document requirements

Required Skills:
- Demonstrated experience with the Hadoop ecosystem (Hive, Pig, MapReduce, Spark, etc.)
- Demonstrated experience with ETL tools such as Informatica
- Experience with scheduling tools such as Tivoli or AutoSys
- Development experience in one or more of the following languages: Java, Python, Scala
- Demonstrated experience with databases such as Teradata, DB2, or Oracle
- Solid understanding of and hands-on experience with Hadoop and big data technologies
- Good knowledge of database structures, theories, principles, and practices