Job details
· Build reliable, scalable, CI/CD-driven streaming and batch data engineering pipelines.
· Perform requirements analysis and coordinate with project managers and development teams to drive the delivery cycle.
· Perform source-to-target data analysis and mapping.
· Design and develop big data solutions.
· Follow agile methodologies and DevOps practices to deliver solutions and product features.
· Define, plan, script, and execute tests, and provide post-go-live support.
· Peer-review the output of other engineers.
· Actively document, record, and share knowledge of our systems.
· Collaborate with leadership and stakeholders to ensure data quality and integrity in DWH Cloud platforms for BI/Analytical reporting.
· Anticipate risks well in advance and communicate them to stakeholders to formulate a mitigation plan.
· Actively carry out data analytics and data profiling to recommend improvements to data quality, processes, and business rules.
· Assist with the coaching and development of other engineers.
· 3+ years of total IT experience.
· Strong experience designing and developing data pipelines for data ingestion and transformation using Java.
· Excellent experience using SQL to analyse data from source systems.
· Excellent experience building and maintaining ETL processes from a wide variety of data sources using SQL.
· Experience with GCP and hands-on data analysis using GCP tools such as:
· Dataflow / Data Pipelines
· Stream-processing systems: Kafka, Pub/Sub
· Experience developing and implementing JUnit classes for unit testing, and using PMD for static code analysis.
· Experience with web services (REST APIs) for interacting with third-party applications to publish and consume data.
· Experience with version control using tools like Git/GitHub.
· Experience with ticketing tools like JIRA/ServiceNow.
· Experience with GCP services such as Compute Engine and Stackdriver Logging.
· Proficiency in Linux shell scripting.
· Exposure to containerization and related technologies (e.g., Docker, Kubernetes)
· Exposure to DevOps practices (source control, continuous integration, deployments, etc.).
· System-level understanding: data structures, algorithms, and distributed storage and compute.
· A can-do attitude toward solving complex business problems, plus good interpersonal and teamwork skills.
· Any GCP certification is a plus.
At Cloudely, we work with a single mission: transform the way clients experience Product & Implementation, Development, and Support. Growth is a journey, never a destination. We are constantly striving to earn the trust of clients globally by offering services across Salesforce, Oracle, Robotic Process Automation, DevOps, Web, and Mobile Programming, to name a few. And we are just getting started!
We have fabulous opportunities for you to grow along with us!
At Cloudely, you will get what you are looking for: the scope to learn, prove yourself, and grow.