2024-01-22 10:33:45
Architect (7+ Yrs)
W2 - Permanent
India
Job Details:
Role: Data Engineer
Experience: 7+ years
Job Experience Requirements:
- Bachelor’s degree in Computer Science, Engineering, Information Systems, Business, or an equivalent field of study required
- 7+ years of experience working with data solutions.
- 3+ years of experience coding in Python, Scala, or a similar scripting language.
- 3+ years of experience developing data pipelines at scale on the AWS Cloud Platform (preferred), Azure, or Snowflake.
- 2+ years of experience designing and implementing data ingestion with real-time streaming tools such as Kafka, Kinesis, or similar.
- Experience with SAP, Salesforce, or other cloud integrations is preferred.
- 3+ years of experience working with MPP databases such as Snowflake (preferred), Redshift, or similar.
- 2+ years of experience with serverless ETL processes (AWS Lambda, AWS Glue, Matillion, or similar).
- 1+ years of experience with big data technologies such as EMR, Hadoop, Spark, Cassandra, MongoDB, or other open-source big data tools.
- Knowledge of professional software engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations.
- Experience designing, documenting, and defending designs for key components in large distributed computing systems
- Demonstrated ability to learn new technologies quickly and independently
- Demonstrated ability to achieve stretch goals in a very innovative and fast paced environment
- Ability to handle multiple competing priorities in a fast-paced environment
- Excellent verbal and written communication skills, especially in technical communications
- Strong interpersonal skills and a desire to work collaboratively
- Experience participating in an Agile software development team, e.g., Scrum
Job Responsibilities:
- Build, deploy, and maintain critical, scalable data pipelines that assemble large, complex data sets meeting both functional and non-functional business requirements.
- Work closely with SMEs, data modelers, architects, analysts, and other team members to gather requirements and build scalable real-time, near-real-time, and batch data solutions.
- Contribute design, code, configurations, and documentation for components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, and loading into a Data Lake, Cloud Data Warehouse, or MPP platform (Snowflake, Redshift, or similar technologies).
- Own one or more key components of the infrastructure and work to continually improve them, identifying gaps and improving the platform’s quality, robustness, maintainability, and speed.
- Cross-train other team members on technologies being developed, while continuously learning new technologies from them in turn.
- Interact with technical teams across Cepheid and ensure that solutions meet customer requirements for functionality, performance, availability, scalability, and reliability.
- Perform development, QA, and DevOps roles as needed to take end-to-end responsibility for solutions.
- Keep up with current trends in big data and analytics, evaluate tools, and pace innovation accordingly.
- Mentor junior engineers and create necessary documentation and runbooks while continuing to deliver on goals.
About us:
At Cloudely, we work with a single mission: transform the way clients experience Product & Implementation, Development, and Support. Growth is a journey, never a destination. We are constantly striving to grow and to gain the trust of clients globally, offering services across Salesforce, Oracle, Robotic Process Automation, DevOps, and Web and Mobile Programming, to name a few. And we are just getting started!

We have fabulous opportunities for you to grow along with us! At Cloudely, you will get what you are looking for: the scope to learn, prove, and grow. The way to your dream job and organization is just a click away. Share your resume at [email protected]. To know more about us, please visit www.cloudely.com.