Senior Data Engineer

Job description

Barcelona (Spain) or Remote, Full time.

What's Landbot

Landbot is a no-code communication automation platform that empowers non-technical users to automate business processes through conversational experiences. COVID-19 is accelerating digitalization at both the consumer and enterprise level, so the adoption of communication automation solutions is skyrocketing.

In three years Landbot has gone from a chatbot builder MVP to a product with thousands of recurring customers across the world who use it daily to get more and better leads. This is possible thanks to our builder, the centerpiece of the product, which allows non-technical people to build engaging conversational experiences.

What's the opportunity

The Senior Data Engineer is a data-savvy professional whose mission is to create the best strategy to develop, maintain, and test the infrastructure (either on-premise or in the cloud) for data generation, cleaning, and ingestion, making it accessible to different stakeholders such as Data Analysts and Business teammates.

We’re looking for someone who wants to build something huge that people love because it makes their lives simpler: we’ve built a company for people who want to do impactful work, surrounded by great people. We have a well-defined, mature culture — and we’re always making it better.

About the role

  • Design highly scalable, concurrent and robust data architectures.
  • Collect and process data from hundreds of sources (half a million events of hundreds of types produced daily).
  • Work with the DevOps team defining infrastructure as code.
  • Meet the needs of data consumers.
  • Curate massive amounts of data and make it accessible to users with different roles.
  • Ensure product and technical features are delivered to spec and on time.
  • Design and implement data features in collaboration with product owners, reporting analysts/data analysts, and business partners within an Agile / Scrum methodology.

Benefits 😍

  • A competitive salary package in a fast-growing start-up.
  • Great culture & working atmosphere: a young, upbeat, and international work environment.
  • Live and work in sunny Barcelona.
  • Remote friendly.
  • Friday happy hours after our weekly team meetings.
  • Open vacation policy and flexible holidays so you can take time off when you need it.
  • Bonus transport tickets, educational content, and all you need to feel empowered.
  • Referral bonus if you bring other talented people like you.
  • Top-notch work equipment.

Job requirements

  • Proficiency with relational SQL and NoSQL databases, including MySQL and MongoDB.
  • Expertise with data pipeline and workflow management tools for ETL: Extracting, Transforming, and Loading data.
  • Knowledge of Elasticsearch and big data services on Google Cloud.
  • Proficiency with Docker and container orchestration tools like Kubernetes.
  • Solid Python programming skills to develop scripts for data cleaning and data ingestion.
  • Familiarity working with GCP.
  • Batch processing with Hadoop or Spark.
  • Fluency in English and Spanish (a must).
  • Eligibility to work in Spain.
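As a flavour of the day-to-day work, here is a minimal sketch of the kind of Python data-cleaning step the role involves (the event shape and field names are illustrative only, not Landbot's actual schema):

```python
from datetime import datetime, timezone
from typing import Optional


def clean_event(raw: dict) -> Optional[dict]:
    """Normalise one raw analytics event; return None for malformed records."""
    if not raw.get("event_type") or raw.get("ts") is None:
        return None  # skip bad events rather than failing the whole batch
    return {
        "event_type": raw["event_type"].strip().lower(),
        # store timestamps as timezone-aware ISO 8601 strings (UTC)
        "ts": datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat(),
        "user_id": raw.get("user_id", "anonymous"),
    }


def run_batch(raw_events: list) -> list:
    """Transform step of a toy ETL batch: clean and filter events."""
    return [e for e in (clean_event(r) for r in raw_events) if e is not None]
```

In a real pipeline this transform would sit between an extract step (pulling raw events from a queue or object store) and a load step (writing to a warehouse), typically orchestrated by a workflow tool such as those listed above.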

Bonus skills & attributes

  • Machine Learning knowledge
  • SciPy, R, Scala
  • Querying tools: Hive, Presto, Drill, Impala, Spark SQL, etc.
  • Stream processing tools like Kafka, Kinesis, Storm, Flink, Spark Streaming, etc.
  • Concurrent and distributed programming: reactive extensions, Akka
  • Graph processing: Spark GraphX, Apache Giraph
  • Lambda architecture, Kappa architecture
  • YARN
  • Cloudera, MapR knowledge
  • Notebooks (Jupyter, Zeppelin, Databricks)
  • Spark MLlib, Deeplearning4j, H2O, TensorFlow, etc.