Apply for Data Engineer at Roposo.com in Gurgaon

Data Engineer

Roposo.com

2 - 5 Years   |   15 - 20 LPA   |   Gurgaon
Job Description
About the company:
Built by a strong team of people focused on having fun while they work, Roposo is India's first social media platform where people express themselves visually through homemade videos and photos. The app offers a seamless browsing experience with user-generated channels. Users can watch what is relevant to them and, at the same time, connect with a wide audience from around the globe. With its post creation and editing tools, users can share their lives, showcase unrevealed talents, and voice their opinions on things that matter.
A brand of Relevant E-solutions, headquartered in Gurgaon, Roposo is the brainchild of three IIT Delhi alumni: Mayank Bhangadia, Avinash Saxena, and Kaushal Shubhank.

Website: https://www.roposo.com/

Designation: Data Engineer

Job Location: Gurgaon

Desired experience: 2-5 years

Salary: 15.00 LPA - 20.00 LPA

Education: B.Tech (CS/IT), MCA

Job Description:
- Understand business and product requirements and provide innovative solutions using Big Data architecture
- Take ownership of the end-to-end data pipeline, including system design and integration of the required Big Data tools and frameworks
- Implement ETL processes and build the data warehouse (HDFS, S3, etc.) at scale
- Develop highly performant Spark jobs for deriving data insights and building user preference, user segmentation, and recommendation engines (a brief illustrative sketch follows this list)
- Develop querying and reporting tools for various business teams
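
To give a concrete flavour of this work, here is a minimal sketch of such a Spark job in PySpark. The S3 paths, column names, and event types are hypothetical and used only for illustration: the job reads raw interaction events, derives a simple per-user engagement aggregate, and writes the result back to the warehouse layer.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("user-engagement-etl").getOrCreate()

# Hypothetical input: raw interaction events landed on S3 as Parquet.
events = spark.read.parquet("s3a://example-bucket/raw/events/")

# Derive a per-user engagement summary (event counts by type).
engagement = (
    events.groupBy("user_id")
    .agg(
        F.count(F.when(F.col("event_type") == "view", True)).alias("views"),
        F.count(F.when(F.col("event_type") == "like", True)).alias("likes"),
        F.count(F.when(F.col("event_type") == "share", True)).alias("shares"),
    )
)

# Write the aggregate back to the warehouse layer for downstream
# segmentation and recommendation work.
engagement.write.mode("overwrite").parquet(
    "s3a://example-bucket/warehouse/user_engagement/"
)

spark.stop()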

Requirements:
- Working knowledge of Linux systems and distributed systems is a must.
- Knowledge of a scripting language such as Python or Scala.
- Hands-on experience writing Spark or MapReduce jobs and a proficient understanding of distributed computing principles.
- Experience with the Lambda Architecture and building the infrastructure it requires.
- Proficiency in more than one of Python, Java, Scala, and shell scripting.
- Experience integrating data from multiple data sources.
- Experience with messaging systems such as Kafka (see the streaming sketch after this list).
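
As an illustration of the Kafka point above, a minimal Spark Structured Streaming sketch is shown below. The broker address, topic name, and paths are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the cluster.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-event-ingest").getOrCreate()

# Hypothetical source: a Kafka topic carrying JSON-encoded interaction events.
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "user-events")
    .load()
)

# Kafka delivers key/value as bytes; cast the payload to a string for parsing downstream.
decoded = stream.select(F.col("value").cast("string").alias("raw_event"))

# Append raw events to a staging area, with checkpointing for fault tolerance.
query = (
    decoded.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/staging/events/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
    .start()
)

query.awaitTermination()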

Skills:
- Understanding of Spark and HDFS internals.
- Experience with Big Data querying tools such as Hive, Pig, and Impala (an example query appears after this list).
- Experience with NoSQL databases such as HBase, Cassandra, and MongoDB.
- Experience with resource managers such as YARN and Mesos.
- Experience with stream-processing systems such as Storm, Spark Streaming, etc.
- Knowledge of Lucene, Solr, Elasticsearch, or any similar technology is a plus.
- Prior experience developing segmentation and recommendation systems is preferred.
- Experience with reporting tools such as Apache Zeppelin.
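
For the querying and reporting points above, the sketch below shows the sort of segmentation report that might be run from a notebook such as Apache Zeppelin. It assumes Hive support is enabled and that a hypothetical user_engagement table (like the one produced by the earlier ETL sketch) exists.

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("engagement-report")
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical reporting query: bucket users into coarse engagement segments.
report = spark.sql("""
    SELECT CASE
             WHEN views + likes + shares >= 100 THEN 'high'
             WHEN views + likes + shares >= 10  THEN 'medium'
             ELSE 'low'
           END AS segment,
           COUNT(*) AS users
    FROM user_engagement
    GROUP BY 1
""")

report.show()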


Education: B.Tech/B.E., M.Tech/M.E., MCA

Work Experience: 2 - 5 Years

Salary: 15 - 20 LPA

Industry: IT
