
AWS Data Engineer  

Cloudely, Inc · Remote, Oregon

Posting date: 2021-12-21 08:26:51
Start date: 2021-12-31
Experience: Senior (5-7 Yrs)
Preferred employment: W2 - Contract
Country: United States
Relocation provided: No
Remote work: Yes

Job details

At Cloudely, we work with a single mission: Transform the way clients experience Product & Implementation, Development, and Support.

Growth is a journey, never a destination. We are constantly striving to grow and to earn the trust of clients globally by offering services across Salesforce, Oracle, Robotic Process Automation, DevOps, Web, and Mobile Programming, to name a few. And we are just getting started!

We have fabulous opportunities for you to grow along with us! 

At Cloudely, you will get what you are looking for: the scope to learn, prove yourself, and grow. We are actively seeking success-hungry candidates who want to grow in the domain of Data Engineering.

Role: Sr. AWS Data Engineer

Location: Remote

Experience: 5-8 years

Job Description:

Responsibilities

  • Responsible for the development, unit testing, and implementation of data integration solutions.
  • Responsible for importing, cleansing, transforming, validating, and analyzing data in order to understand it and draw conclusions from it for data modeling, data integration, and decision-making purposes.
  • The primary role is developing and unit testing ETL mapping jobs based on the requirement documents (HLD/LLD/STM) provided by the Lead or the Client, and fixing defects found during the QA and UAT process.
  • Should possess good knowledge of databases and Linux/UNIX environments for daily tasks, and should have exposure to data models and data warehouse concepts in order to create ETL jobs for dimension and fact tables.
  • Create LLD/STM documents; develop, test, and deploy data integration processes using the ETL tool of choice.
  • Adopt integration standards, best practices, the ABC framework, etc. while creating ETL jobs.
  • Perform data validation, cleansing, and analysis.
  • Test, debug, and document ETL processes, SQL queries, and stored procedures.
  • Analyze data from various sources, including databases and flat files.
  • Handle ETL development, deployment, optimization, support, and defect fixes.
  • Develop error-handling processes.
  • Create and implement a scheduling strategy (see the illustrative sketch after this list).
  • Possess good knowledge of Agile and Waterfall methodologies.
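
The scheduling bullet above is the kind of work typically automated with an orchestrator such as Airflow (listed in the qualifications below). Purely as an illustrative sketch, and not a description of Cloudely's actual setup, a minimal daily DAG might look like the following; the DAG id, schedule, and callable are hypothetical placeholders:

    # Illustrative sketch only -- DAG id, schedule, and callable are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_etl_job():
        # Placeholder for the actual extract/transform/load logic.
        print("ETL job would run here")

    with DAG(
        dag_id="example_daily_etl",      # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",      # run once per day
        catchup=False,                   # do not backfill past runs
    ) as dag:
        PythonOperator(
            task_id="run_etl",
            python_callable=run_etl_job,
        )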

Qualifications:

At least 3-5 years of experience with design, development, automation, and support of applications to:

  • Extract, transform, and load (ETL) data.
  • Must have worked with AWS S3, SNS/SQS, Glue, Athena, Redshift, and Lambda.
  • Proficient with Databricks Spark.
  • Proficient in SQL and SQL scripting.
  • Familiarity with big data solutions (e.g., data lake / Delta Lake).
  • Ability to work in an agile environment.
  • Build and maintain solutions with the latest technologies, including PySpark, Hadoop, Hive, Redshift, Scala, Python, Airflow, and traditional SQL stored procedures and ETL processes (a minimal illustrative sketch follows this list).
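
Purely for illustration, and assuming nothing about Cloudely's actual pipelines, a minimal PySpark job of the kind described above (extract raw files from S3, cleanse and validate, load curated data back to S3 for Athena / Redshift Spectrum) might look like this; the bucket names, paths, and column names are hypothetical placeholders:

    # Illustrative sketch only -- bucket names, paths, and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

    # Extract: read raw CSV files landed in S3.
    raw = (spark.read
           .option("header", "true")
           .csv("s3://example-raw-bucket/orders/"))

    # Transform: basic cleansing and validation.
    clean = (raw
             .dropDuplicates(["order_id"])
             .filter(F.col("order_id").isNotNull())
             .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd")))

    # Load: write partitioned Parquet for downstream Athena / Redshift Spectrum queries.
    (clean.write
     .mode("overwrite")
     .partitionBy("order_date")
     .parquet("s3://example-curated-bucket/orders/"))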

Role: Jr. AWS Data Engineer

Location: Remote

Experience: 2-5 years

Job Description:

Responsibilities

  • Responsible for the development, unit testing, and implementation of data integration solutions.
  • Responsible for importing, cleansing, transforming, validating, and analyzing data in order to understand it and draw conclusions from it for data modeling, data integration, and decision-making purposes.
  • The primary role is developing and unit testing ETL mapping jobs based on the requirement documents (HLD/LLD/STM) provided by the Lead or the Client, and fixing defects found during the QA and UAT process.
  • Should possess good knowledge of databases and Linux/UNIX environments for daily tasks, and should have exposure to data models and data warehouse concepts in order to create ETL jobs for dimension and fact tables.
  • Create LLD/STM documents; develop, test, and deploy data integration processes using the ETL tool of choice.
  • Adopt integration standards, best practices, the ABC framework, etc. while creating ETL jobs.
  • Perform data validation, cleansing, and analysis.
  • Test, debug, and document ETL processes, SQL queries, and stored procedures.
  • Analyze data from various sources, including databases and flat files.
  • Handle ETL development, deployment, optimization, support, and defect fixes.
  • Develop error-handling processes.
  • Create and implement a scheduling strategy.
  • Possess good knowledge of Agile and Waterfall methodologies.

Qualifications:

At least 3-5 years of experience with design, development, automation, and support of applications to:

  • Extract, transform, and load (ETL) data.
  • Must have worked with AWS S3, SNS/SQS, Glue, Athena, Redshift, and Lambda.
  • Proficient with Databricks Spark.
  • Proficient in SQL and SQL scripting.
  • Familiarity with big data solutions (e.g., data lake / Delta Lake).
  • Ability to work in an agile environment.
  • Build and maintain solutions with the latest technologies, including PySpark, Hadoop, Hive, Redshift, Scala, Python, Airflow, and traditional SQL stored procedures and ETL processes.

Your dream job and organization are just a click away. Share your resume at [email protected]. To learn more about us, please visit www.cloudely.com.

