IRIS AUTOMATION PVT LTD - Big Data Developer - New Delhi
Big Data Developer

IRIS AUTOMATION PVT LTD

0 - 3 Years   |   Confidential   |   New Delhi
Job Description
About the Company:
Iris Software is a global Information Technology services organization offering high-quality solutions to businesses. It services the information technology requirements of Fortune 1000 companies by utilizing specialized domain knowledge, best-of-breed technologies, rapidly deployable proprietary frameworks/solutions and flexible engagement models.

We have been serving our customers for over 25 years from our offices in New York, Toronto, New Delhi and our headquarters, Edison, New Jersey. We service our customers across two broad business lines - Financial Services and Enterprise Services.

Our Financial Services group serves its Financial Services customers with deep domain knowledge & wide execution experience in many areas such as allocation & settlement, credit & market risk, market & reference data, & municipal bond underwriting. The Enterprise Services group works with customers in multiple industries - Life Sciences, Industrial goods manufacturing, Professional Services companies and Hi-Tech organizations.

Iris provides thought leadership to solve business problems creatively by conceptualizing & delivering uncommon solutions leveraging existing technologies & new computing paradigms such as Digital Transformation, Machine Learning, Analytics & Insights, Cloud Computing & Process Automation. Iris continually innovates to accelerate outcomes such as time-to-market, lower costs, ease of maintenance & reduced TCO.

Website: www.irissoftware.com

Profile Offered: Big Data Developer

Job location: Delhi

Course Specialization: ME/M.Tech, BCA, MCA, BE/B.Tech

Desired Experience: 0 to 3 Years

Competencies:
Hands-on individual responsible for producing excellent-quality code, adhering to expected coding standards and industry best practices.
Good experience in Spark, Hive, MapReduce.
Good knowledge of Spring, Hibernate, caching frameworks, memory management.
Problem-solving/troubleshooting skills.
High levels of ownership and commitment on deliverables.
Strong communication skills - should be able to interact with client stakeholders to probe a technical problem or clarify requirement specifications.
Knowledge of RDBMS & Business Intelligence solutions preferred.
Passionate team player with a can-do attitude.
Ability to be gracefully persuasive in discussions to get things done. Ability to drive quality.
Ability to see the big picture, think innovatively and suggest out-of-the-box solutions.


Education: B.Tech/B.E., BCA, M.Tech./M.E., MCA

Work Experience: 0 - 3 Years

Salary: Confidential

Industry: IT
