Job Description
- 8+ years of relevant consulting or industry experience
- 2+ years in a technical or functional lead role
- Experience working independently with minimal guidance
- Strong problem solving and troubleshooting skills with experience exercising mature judgment
- Proven experience effectively prioritizing workload to meet deadlines and work objectives
- Demonstrated ability to write clearly, succinctly, and in a manner that appeals to a wide audience
- Proficiency in word processing, spreadsheet, and presentation creation tools, as well as Internet research tools
- 6+ years of relevant technology architecture consulting or industry experience, including information delivery, analytics, and business intelligence based on data from a hybrid of the Hadoop Distributed File System (HDFS), non-relational databases (e.g., NoSQL stores such as MongoDB and Cassandra), and relational data warehouses
- 3+ years of hands-on experience with data lake implementations, core modernization, and data ingestion
- 1+ year of hands-on cloud experience with Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)
- 3+ years of hands-on experience with big data technologies such as MapReduce, Pig, Hive, HBase, Sqoop, Spark, Flume, YARN, Kafka, and Storm
- Experience working with commercial Hadoop distributions (Hortonworks, Cloudera, Pivotal HD, MapR)