Roles and Responsibilities:
- Must have experience in Presales, Consulting, RFPs (Big Data, Analytics/Reporting & AI projects)
- Ability to run programs end to end as an individual contributor (IC) or in coordination with junior team members.
- Work with customers to suggest appropriate technical designs.
- Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities.
- Implementing ETL processes
- Work with customers to suggest relevant cloud platforms for migrations and implementations.
- Monitoring performance and advising on any necessary infrastructure changes
- Defining data retention policies
- Designing data governance, data strategies, and analytics strategies
- Conducting POCs
Skillsets Required:
- Proficient understanding of distributed computing principles
- Management of Hadoop clusters and cloud data platforms such as Snowflake
- Experience with building stream-processing systems, using solutions such as Storm or Spark Streaming
- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
- Experience with integration of data from multiple data sources
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O
- Good understanding of Lambda Architecture, along with its advantages and drawbacks
- Experience with Cloudera/MapR/Hortonworks
- Exposure to reporting tools such as Qlik, Tableau, Power BI, etc.
- Exposure to data science concepts
- Knowledge of at least one cloud platform: Azure, AWS, or GCP
- Performing CMMI-related activities
- People Management skills
Experience: 10-16 Years
Job Location: Hyderabad