2021-11-13 12:18:44
2021-11-18
Mid (3-5 Yrs)
W2 - Permanent
India
At Cloudely, we work with a single mission: Transform the way clients experience Product & Implementation, Development, and Support.
Growth is a journey, never a destination. We constantly strive to grow by earning the trust of clients globally, offering services across Salesforce, Oracle, Robotic Process Automation, DevOps, and web and mobile programming, to name a few. And we are just getting started!
We have fabulous opportunities for you to grow along with us!
At Cloudely, you will get what you are looking for: the scope to learn, prove yourself, and grow. We are now actively seeking success-hungry candidates who want to grow in the data domain.
Role: Data Engineer
Experience: 3+ years
Primary Skills
- Strong in SQL/RDBMS
- Strong in programming languages such as Java, Python, or Scala
- Experience with one or more big data tools such as Hadoop, Kafka, Spark, or Beam
Secondary Skills
- AWS or GCP
- Knowledge of pipeline orchestration tools such as Airflow or Composer, or equivalent services such as AWS Glue
Responsibilities:
1. Develop and deploy batch and streaming data pipelines in a cloud ecosystem.
2. Automate manual processes and tune the performance of existing pipelines.
3. Load and process data from multiple source locations into data lakes, data marts, and data warehouses while keeping cost, performance, and security in mind.
4. Automate and develop analytics tooling, and occasionally assist with visualization setup.
5. Develop processes for migrating on-premises data to cloud environments.
Mandatory Skills:
1. 4+ years of experience in IT programming and application/product development.
2. At least 2 years of experience working in big data cloud ecosystems such as AWS, GCP, PCF, or Azure.
3. Strong in SQL/RDBMS and in at least one programming language such as Java, Python, or Scala.
4. Experience with one or more big data tools such as Hadoop, Kafka, Spark, or Beam.
5. Good experience with AWS services such as EC2, EMR, and Redshift, or equivalent GCP services such as Compute Engine, BigQuery, and Dataflow.
6. Good knowledge of data structures.
7. Experience working with multiple operating systems (Windows, Linux, Unix) and good scripting knowledge, including shell/Bash.
8. Basic knowledge of web and server-side frameworks such as AngularJS, React, Node.js, or Django.
9. Good knowledge of NoSQL database concepts.
10. Very good communication and teamwork skills.
Preferred Skills:
1. Working knowledge of message queues (MQs), publisher-consumer models, and streaming data processing.
2. Experience with DevOps tools such as GitHub and Jenkins, and with monitoring systems.
3. Good knowledge of pipeline orchestration tools such as Airflow or Composer, or equivalent services such as AWS Glue.
Your dream job and organization are just a click away. Share your resume at [email protected]. To learn more about us, please visit www.cloudely.com.