Verified partner
Azure Data Engineer
Outdefine Partner
Web3
10-50
San Francisco, CA, USA
Apply Now

About the job

Overview:

Job Description: Cleans, prepares, and optimizes data for further analysis and modelling. Designs, develops, optimizes, and maintains data architecture and pipelines that adhere to ELT (extract, load, transform) principles and business goals.

Roles and Responsibilities:
- Designs, develops, optimizes, and maintains data architecture and pipelines that adhere to ELT principles and business goals.
- Solves complex data problems to deliver insights that help the business achieve its goals.
- Creates data products for engineers, analysts, and data scientists to accelerate their productivity.
- Engineers effective features for modelling in close collaboration with data scientists and the business.
- Leads the evaluation, implementation, and deployment of emerging tools and processes for analytics data engineering to improve productivity and quality.
- Partners with machine learning engineers, BI, and solutions architects to develop technical architectures for strategic enterprise projects and initiatives.
- Fosters a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions.
- Advises, consults, mentors, and coaches other data and analytics professionals on data standards and practices.
- Develops and delivers communication and education plans on analytics data engineering capabilities, standards, and processes.
- Learns about machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics as necessary to carry out the role effectively.
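To make the ELT pattern named in the responsibilities concrete, here is a minimal, illustrative sketch: raw records are landed first, then cleaned and transformed with SQL inside the database itself. The table names, columns, and data are invented for the example (an in-memory SQLite database stands in for a real warehouse like Snowflake or Databricks).

```python
import sqlite3

# Invented sample data: note the stray whitespace and missing quantity,
# which the transform step cleans up after loading.
raw_orders = [
    ("A-1", "  widget ", 3, 4.50),
    ("A-2", "gadget", None, 9.99),
    ("A-3", "widget", 2, 4.50),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id TEXT, product TEXT, qty INTEGER, price REAL)")

# Load: land the data as-is, before any cleaning (the "L" comes before the "T").
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?, ?)", raw_orders)

# Transform: clean and derive columns in SQL, producing an analysis-ready table.
conn.execute("""
    CREATE TABLE orders_clean AS
    SELECT TRIM(product)            AS product,
           COALESCE(qty, 1)         AS qty,
           price,
           COALESCE(qty, 1) * price AS revenue
    FROM raw_orders
""")

rows = conn.execute(
    "SELECT product, SUM(revenue) FROM orders_clean GROUP BY product ORDER BY product"
).fetchall()
print(rows)  # [('gadget', 9.99), ('widget', 22.5)]
```

The key contrast with classic ETL is that the warehouse, not an upstream job, does the transformation, which is what lets analysts and data scientists work directly against the cleaned tables.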

Required Skills:
- 5-10 years of experience required.
- Experience designing and maintaining data warehouses and/or data lakes with big data technologies such as Spark/Databricks, or distributed databases such as Redshift and Snowflake, and experience housing, accessing, and transforming data in a variety of relational databases.
- Experience building data pipelines and deploying/maintaining them following modern data engineering best practices (e.g., DBT, Airflow, Spark, the Python OSS data ecosystem).
- Knowledge of software engineering fundamentals and development tooling (e.g., Git, CI/CD, JIRA), and familiarity with Linux and the Bash/Z shell.
- Experience with cloud database technologies (e.g., Azure) and with developing solutions on cloud computing services and infrastructure in the data and analytics space.
- Basic familiarity with BI tools (e.g., Alteryx, Tableau, Power BI, Looker).
- Expertise in ELT and data analysis, primarily with SQL.
- Conceptual knowledge of data and analytics, such as dimensional modelling, reporting tools, data governance, and structured and unstructured data.
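Orchestrators like Airflow and DBT, named in the skills above, share one core idea: pipeline steps form a dependency graph (DAG), and each step runs only after everything it depends on has finished. A minimal sketch of that idea using only the Python standard library's `graphlib` (the step names are invented; in a real pipeline each would trigger an extract, load, or transform job):

```python
from graphlib import TopologicalSorter

# Each key is a pipeline step; its set lists the steps that must finish first.
dag = {
    "extract_sales": set(),
    "extract_customers": set(),
    "load_raw": {"extract_sales", "extract_customers"},
    "transform_staging": {"load_raw"},
    "build_marts": {"transform_staging"},
}

# static_order() yields a valid execution order respecting every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Real orchestrators add scheduling, retries, and parallelism on top, but the topological ordering shown here is the scheduling primitive underneath.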

Skills required
Data structures, Data pipeline, Azure, Python, Spark
Employee location
San Francisco, CA, USA
Experience level
Not specified
Workplace type
Onsite
Job type
Full-time contract
Compensation
$15 - 25 /hr
Currency
🇺🇸 USD

Become a trusted member, apply to jobs, and earn token rewards

Create a profile

Create and customize your member profile.

Complete assessment

Earn 500 Outdefine tokens for becoming a trusted member and completing your assessment.

Apply for jobs

Once you are a Trusted Member, you can start applying to jobs.
