LSports
We’re searching for a

Software Architect

R&D | Tel Aviv, IL

About The Position

LSports is a leading global provider of sports data, dedicated to revolutionizing the industry through innovative solutions. We excel in sports data collection and analysis, advanced data management, and cutting-edge services like AI-based sports tips and high-quality sports visualization. As the sports data industry continues to grow, LSports remains at the forefront, delivering real-time solutions.

If you share our love of sports and tech, and have the passion and drive to advance the sports-tech and data industries, join the team. We are looking for a highly motivated Software Architect.

Responsibilities

  • Define and lead the architecture of complex software systems and platforms, from design to deployment.
  • Collaborate with cross-functional teams (Data, ML, CV, DevOps) to align architecture with product and business goals.
  • Design and oversee the development of high-throughput, low-latency services and data pipelines.
  • Guide the implementation of best practices in software engineering, including system design, scalability, reliability, testing, and monitoring.
  • Evaluate and adopt technologies (e.g., Apache Iceberg, event-driven architectures, observability platforms) to improve system performance and development velocity.
  • Mentor engineers and contribute to architectural knowledge sharing across the company.

Requirements

  • At least 10 years of experience in a data engineering role, including 2+ years as a Software Architect with ownership over company-wide architecture decisions.
  • Proven experience designing and implementing large-scale, Big Data infrastructure from scratch in a cloud-native environment (GCP preferred).
  • Excellent proficiency in data modeling, including conceptual, logical, and physical modeling for both analytical and real-time use cases.
  • Strong hands-on experience with:
      • Data lake and/or warehouse technologies (e.g., Apache Iceberg, Delta Lake, BigQuery, ClickHouse); Apache Iceberg experience is required
      • ETL/ELT frameworks and orchestrators (e.g., Airflow, dbt, Dagster)
      • Real-time streaming technologies (e.g., Kafka, Pub/Sub)
      • Data observability and quality monitoring solutions
  • Excellent proficiency in SQL, and in either Python or JavaScript.
  • Experience designing efficient data extraction and ingestion processes from multiple sources and handling large-scale, high-volume datasets.
  • Demonstrated ability to build and maintain infrastructure optimized for performance, uptime, and cost, with awareness of AI/ML infrastructure requirements.
  • Experience working with ML pipelines and AI-enabled data workflows, including support for Generative AI initiatives (e.g., content generation, vector search, model training pipelines) — or strong motivation to learn and lead in this space.
  • Excellent communication skills in English, with the ability to clearly document and explain architectural decisions to technical and non-technical audiences.
  • Fast learner with strong multitasking abilities; capable of managing several cross-functional initiatives simultaneously.
  • Willingness to work on-site in Ashkelon once a week.

Advantages

  • Experience leading POCs and tool selection processes.
  • Familiarity with Databricks, LLM pipelines, or vector databases.

Apply
