Principal Data Engineer (Azure)

Tiger Analytics is a global AI and analytics consulting firm. With data and technology at the core of our solutions, we are solving problems that eventually impact the lives of millions globally. Our culture is modeled around expertise and respect with a team-first mindset. Headquartered in Silicon Valley, you'll find our delivery centers across the globe and offices in multiple cities across India, the US, UK, Canada, and Singapore, including a substantial remote global workforce.

We're Great Place to Work-Certified. Working at Tiger Analytics, you'll be at the heart of an AI revolution. You'll work with teams that push the boundaries of what is possible and build solutions that energize and inspire.

Requirements

Curious about the role? What would your typical day look like?

As a Principal Data Engineer (Azure), you will bring hands-on experience with Azure as the cloud platform and Databricks, along with some exposure to data modelling. You will build and learn about a variety of analytics solutions and platforms, including data lakes, modern data platforms, and data fabric solutions, using different open-source, big data, and cloud technologies on Microsoft Azure.

  • Design and build scalable, metadata-driven data ingestion pipelines for batch and streaming datasets (see the sketch after this list)
  • Conceptualize and execute high-performance data processing for structured and unstructured data, and data harmonization
  • Schedule, orchestrate, and validate pipelines
  • Design exception handling and log monitoring for debugging
  • Ideate with your peers to make tech stack and tools-related decisions
  • Interact and collaborate with multiple teams (Consulting / Data Science & App Dev) and various stakeholders to meet deadlines and bring analytical solutions to life
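To make the first responsibility concrete, here is a minimal, purely illustrative sketch of a metadata-driven ingestion loop on Databricks with PySpark and Auto Loader. The storage account, container, and source names are hypothetical placeholders, and the metadata would normally come from a control table or an ADF pipeline parameter rather than a hard-coded list.

    from pyspark.sql import SparkSession

    # On Databricks a SparkSession already exists; getOrCreate() simply reuses it.
    spark = SparkSession.builder.appName("metadata-driven-ingestion").getOrCreate()

    # Hypothetical pipeline metadata; in practice this would live in a control table.
    LAKE = "abfss://bronze@examplestorage.dfs.core.windows.net"  # hypothetical ADLS account
    SOURCES = [
        {"name": "orders",    "format": "json", "path": "abfss://raw@examplestorage.dfs.core.windows.net/orders/"},
        {"name": "customers", "format": "csv",  "path": "abfss://raw@examplestorage.dfs.core.windows.net/customers/"},
    ]

    def ingest(source):
        """Incrementally load one source with Auto Loader and land it as a Delta table."""
        (spark.readStream.format("cloudFiles")                    # Databricks Auto Loader
            .option("cloudFiles.format", source["format"])
            .option("cloudFiles.schemaLocation", f"{LAKE}/_schemas/{source['name']}")
            .load(source["path"])
            .writeStream.format("delta")
            .option("checkpointLocation", f"{LAKE}/_checkpoints/{source['name']}")
            .trigger(availableNow=True)                           # drain available files, then stop
            .start(f"{LAKE}/{source['name']}"))

    for src in SOURCES:
        ingest(src)

The same loop covers batch and streaming sources because Auto Loader runs as a stream; swapping the trigger for a continuous one turns the batch-style run into an always-on pipeline.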
What do we expect?

  • Experience in implementing a Data Lake with technologies like Azure Data Factory (ADF), PySpark, Databricks, ADLS, and Azure SQL Database
  • A comprehensive foundation with working knowledge of Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview
  • A passion for writing high-quality code that is modular, scalable, and free of bugs, with debugging skills in SQL, Python, or Scala / Java
  • Enthusiasm for collaborating with various stakeholders across the organization and taking complete ownership of deliverables
  • Experience in using big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4J, Elastic Search
  • An adept understanding of different file formats like Delta Lake, Avro, Parquet, JSON, and CSV
  • Good knowledge of designing and building REST APIs, with real-time experience working on Data Lake or Lakehouse projects
  • Experience in supporting BI and Data Science teams in consuming data in a secure and governed manner
  • Certifications like Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Developer (DE) are a valuable addition

    Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

    Job Requirements

  • Mandatory: Azure Data Factory (ADF), PySpark, Databricks, ADLS, Azure SQL Database
  • Optional: Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview
  • Strong programming, unit testing, and debugging skills in SQL, Python, or Scala / Java (see the sketch after this list)
  • Some experience using big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4J, Elastic Search
  • Good understanding of different file formats like Delta Lake, Avro, Parquet, JSON, and CSV
  • Experience working on Agile projects and following DevOps processes with technologies like Git, Jenkins, and Azure DevOps
  • Good to have:
  • Experience working on Data Lake & Lakehouse projects
  • Experience building REST services and implementing service-oriented architectures
  • Experience supporting BI and Data Science teams in consuming data in a secure and governed manner
  • Certifications like Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Developer (DE)
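As a purely illustrative companion to the unit-testing expectation above, here is a small PySpark transformation with a pytest-style test. The function, column, and test names are hypothetical and only sketch the kind of testable pipeline code the role calls for.

    from pyspark.sql import DataFrame, SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.window import Window

    def deduplicate_latest(df: DataFrame, key: str, ts_col: str) -> DataFrame:
        """Keep only the most recent record per key, based on a timestamp column."""
        w = Window.partitionBy(key).orderBy(F.col(ts_col).desc())
        return (df.withColumn("_rn", F.row_number().over(w))
                  .filter(F.col("_rn") == 1)
                  .drop("_rn"))

    def test_deduplicate_latest():
        spark = SparkSession.builder.master("local[1]").appName("unit-test").getOrCreate()
        df = spark.createDataFrame(
            [("c1", "2024-01-01", 10), ("c1", "2024-02-01", 20), ("c2", "2024-01-15", 5)],
            ["customer_id", "updated_at", "amount"],
        )
        result = deduplicate_latest(df, key="customer_id", ts_col="updated_at")
        amounts = {row["customer_id"]: row["amount"] for row in result.collect()}
        assert amounts == {"c1": 20, "c2": 5}  # only the latest record per customer survives

Keeping transformations as plain functions like this makes them easy to test locally with pytest before they are wired into ADF or Databricks Workflows.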
    Benefits

    This position offers an excellent opportunity for significant career development in a fast-growing and challenging entrepreneurial environment with a high degree of individual responsibility.

    Seniority level

    Associate

    Employment type

    Full-time

    Job function

    Information Technology

    Industries

    IT Services and IT Consulting

    Common Interview Questions And Answers

    1. HOW DO YOU PLAN YOUR DAY?

    Here is what this question is really asking: When do you focus and start working seriously? What are the hours when you work optimally? Are you a night owl? A morning bird? Remote teams can be made up of people working different shifts around the world, so you won't necessarily be stuck in a 9-5 schedule if it's not for you...

    2. HOW DO YOU USE THE DIFFERENT COMMUNICATION TOOLS IN DIFFERENT SITUATIONS?

    When you're working on a remote team, there's no way to chat in the hallway between meetings or catch up on the latest project during an office carpool. Therefore, virtual communication will be absolutely essential to get your work done...

    3. WHAT DOES "WORKING REMOTELY" REALLY MEAN FOR YOU?

    Many people want to work remotely because of the flexibility it allows. You can work anywhere and at any time of the day...

    4. WHAT DO YOU NEED IN YOUR PHYSICAL WORKSPACE TO SUCCEED IN YOUR WORK?

    With this question, companies are looking to see what equipment they may need to provide you with and to verify how aware you are of what remote working could mean for you physically and logistically...

    5. HOW DO YOU PROCESS INFORMATION?

    Several years ago, I was working on a team planning a big event. My supervisor had us all do a team exercise before the big day. One of our activities was to find out how each of us processes information...

    6. HOW DO YOU MANAGE YOUR CALENDAR AND SCHEDULE? WHICH APPLICATIONS / SYSTEMS DO YOU USE?

    Or you may receive even more specific questions, such as: What's on your calendar? Do you plan blocks of time to do certain types of work? Do you have an open calendar that everyone can see?...

    7. HOW DO YOU ORGANIZE FILES, LINKS, AND TABS ON YOUR COMPUTER?

    Just like your schedule, how you track files and other information is very important. After all, everything is digital!...

    8. HOW DO YOU PRIORITIZE WORK?

    The day I watched Marie Forleo's video about separating the important from the urgent, my life changed. Not all remote jobs are fast-paced, but most of them are...

    9. HOW DO YOU PREPARE FOR AND RUN A MEETING? WHAT DO YOU SEE HAPPENING DURING THE MEETING?

    Just as communication is essential when working remotely, so is organization. Because you won't have those chance conversations in the elevator or the lunchroom, you should make the most of the little time you have in a video or phone conference...

    10. HOW DO YOU USE TECHNOLOGY ON A DAILY BASIS, FOR WORK AND FOR FUN?

    This is a great question because it shows your comfort level with technology, which is very important for a remote worker because you will be working through technology all the time...