2021-10-28 19:18:08
2021-11-08
Architect (7+ Yrs)
Corp - Corp, W2 - Permanent, W2 - Contract
United States
At Cloudely, we work with a single mission: Transform the way clients experience Product & Implementation, Development, and Support.
Growth is a journey, never a destination. We are constantly striving to earn the trust of clients globally by offering services across Salesforce, Oracle, Robotic Process Automation, DevOps, and Web and Mobile Programming, to name a few. And we are just getting started!
We have fabulous opportunities for you to grow along with us!
At Cloudely, you will get what you are looking for: the scope to learn, prove yourself, and grow.
Job Title: Snowflake Data Engineer
Location: Coraopolis, PA (or nearby locations close to Coraopolis)
Duration: 6-12 Months
Job Description:
· Snowflake data engineers will be responsible for designing and developing the artifacts needed to migrate data from legacy databases to the Snowflake cloud data warehouse.
· Solid experience with and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must.
· Developing solutions using a combination of Python and Snowflake's SnowSQL, and writing SQL queries against Snowflake (a minimal sketch follows this list).
· Developing scripts using Unix shell, Python, etc. to extract, load, and transform (ELT) data.
· Working knowledge of the various Azure components used in building data pipelines.
· Providing support for data warehouse issues such as data load problems and transformation or translation problems.
· Translating BI and reporting requirements into database and reporting designs.
· Understanding data transformation and translation requirements and which tools to leverage to get the job done.
· Testing and clearly documenting implementations so that others can easily understand the requirements, implementation, and test conditions.
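As a minimal sketch of the Python-plus-SnowSQL work described above (illustration only, not part of the posting): the example below uses the snowflake-connector-python package to stage a legacy extract, bulk-load it with COPY INTO, and run a simple in-warehouse transform. The connection parameters, file path, and the ORDERS_RAW/ORDERS tables are placeholder assumptions.

import snowflake.connector  # assumes the snowflake-connector-python package is installed

# Placeholder connection parameters -- replace with real account details.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()

    # Extract/Load: stage a local export from the legacy database in the
    # table stage of ORDERS_RAW (assumed to exist), then bulk-load it.
    cur.execute("PUT file:///tmp/legacy_orders.csv @%ORDERS_RAW")
    cur.execute(
        "COPY INTO ORDERS_RAW FROM @%ORDERS_RAW "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )

    # Transform: a simple in-warehouse step that dedupes the raw load
    # into a curated table.
    cur.execute("CREATE OR REPLACE TABLE ORDERS AS SELECT DISTINCT * FROM ORDERS_RAW")
finally:
    conn.close()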
Basic Qualifications:
· Minimum 2 years of experience developing a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse.
· 3 years of hands-on experience building production data ingestion and processing pipelines using Spark and Python.
· 2 years of hands-on experience designing and implementing production-grade data warehousing solutions on large-scale data technologies such as Teradata, Oracle, or DB2.
· Expertise in and excellent understanding of Snowflake internals and the integration of Snowflake with other data processing and reporting technologies.
· Excellent presentation and communication skills, both written and verbal
· Ability to problem-solve and architect solutions in an environment with unclear requirements.
Preferred Skills:
· Minimum 2 years of experience developing data solutions on Snowflake.
· Should be able to write Python scripts to integrate with Snowflake.
· Should have worked in a Unix environment.
· Should be able to write stored procedures and complex queries using Snowflake (a minimal sketch follows this list).
· Exposure to Azure data components (ADLS, Data Factory, Azure SQL, Databricks).
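As a minimal sketch of the stored-procedure skill mentioned above (illustration only): the example below creates and calls a simple Snowflake Scripting procedure from Python. The procedure, the ORDERS/ORDERS_ARCHIVE tables, and the connection parameters are hypothetical placeholders, not details from this posting.

import snowflake.connector  # assumes the snowflake-connector-python package is installed

# Placeholder connection parameters -- replace with real account details.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# A Snowflake Scripting (SQL) stored procedure that moves orders older than a
# cutoff date into an archive table. ORDERS and ORDERS_ARCHIVE are example tables.
cur.execute("""
CREATE OR REPLACE PROCEDURE ARCHIVE_OLD_ORDERS(CUTOFF DATE)
RETURNS NUMBER
LANGUAGE SQL
AS
$$
BEGIN
    INSERT INTO ORDERS_ARCHIVE SELECT * FROM ORDERS WHERE ORDER_DATE < :CUTOFF;
    DELETE FROM ORDERS WHERE ORDER_DATE < :CUTOFF;
    RETURN SQLROWCOUNT;  -- rows removed from ORDERS by the DELETE
END;
$$
""")

# Call the procedure and print how many rows were archived.
cur.execute("CALL ARCHIVE_OLD_ORDERS('2020-01-01'::DATE)")
print(cur.fetchone()[0])
conn.close()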
The way to your dream job and organization is just a click away. Share your resume at [email protected]. To learn more about us, please visit www.cloudely.com.