We are looking for multiple Data Engineers to join a six-month contract project; the role may be performed remotely. You will help us build and maintain a Snowflake data warehouse that captures historical and ongoing business data. The data in the warehouse needs to be cleaned, transformed, and merged to generate a holistic view of the business, which will serve as the foundation for downstream in-depth analytics and machine learning.
Responsibilities
Build Snowflake data pipelines/ETL jobs to ingest data from text files and from relational and NoSQL databases.
Perform ongoing maintenance and administration of the Snowflake warehouse.
Design and implement Snowflake schemas including tables, views, and materialized views.
Understand the data from a business perspective and write SQL scripts based on business logic.
Manage data sharing and data access for business users.
Optimize data storage and warehouse query performance.
Qualifications and Experience
Experience with the Snowflake warehouse, data pipelines, Apache Airflow, and ETL tools in an enterprise environment.
Strong experience working with large data sets from multiple sources, including data cleansing, merging, and aggregation.
Strong SQL and Python programming skills. Knowledge of R programming is a plus.
Experience with MySQL, MongoDB, and AWS services (S3, SQS, Lambda, etc.).
Experience with business intelligence tools is a plus.
Experience with machine learning is a plus.
To apply for the position, please send your resume to jobs@sqlytics.com.