Greetings from Photon!!
Who are we?
Photon has emerged as one of the world’s largest and fastest-growing digital agencies. We work with 40% of the Fortune 100 on their digital initiatives and are known for our ability to integrate strategy consulting, creative design, and technology at scale. For a brief one-minute video about us, you can check:
Position: Hadoop Developer
Location: Plano, TX
Description
Hadoop Developer is a hybrid role requiring experience in database management, clustered compute, operating system integration, cloud concepts, storage solutions, application processing, and advanced monitoring techniques. The resource has experience across multiple disciplines, including cloud, Linux, and Hadoop. Must be able to lead complex projects, manage competing priorities, and bring a high level of technical acumen and strong communication skills.
Must-have skills: Hive, Sqoop, Impala, and Spark
Required Skills:
- Experience with multiple large-scale enterprise Hadoop or big data environments (Databricks, Cloudera, HDInsight, or others), focused on operations, design, capacity planning, cluster setup, security, performance tuning, and monitoring
- Experience with the full Cloudera CDH/CDP distribution to install, configure and monitor all services in the Cloudera stack
- Strong understanding of core Hadoop services such as HDFS, MapReduce, Kafka, Spark and Spark Streaming, Hive, Impala, HBase, Kudu, Sqoop, and Oozie
- Experience administering and supporting RHEL Linux operating systems, databases, and hardware in an enterprise environment
- Expertise in typical system administration and programming skills, such as storage capacity management, debugging, and performance tuning
- Proficiency in shell scripting (e.g., Bash, ksh)
- Experience in the setup, configuration, and management of security for Hadoop clusters using Kerberos integrated with LDAP/AD at an enterprise level
- Experience operating at large enterprise scale, including resource-separation concepts and the physical resources those environments require to operate (storage, memory, network, and compute)
Desired Skills:
- Experience with version control systems (Git)
- Experience with Spectrum Conductor or Databricks with Apache Spark
- Experience in multiple programming languages (Python, Java, R)
- Experience with enterprise database administration platforms
- Experience with large-scale analytic tools, including SAS, search, machine learning, and log aggregation
- Experience with Hadoop distributions in the cloud (AWS, Azure, Google Cloud) is a plus